00:00:00.000 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 1052
00:00:00.000 originally caused by:
00:00:00.000 Started by upstream project "nightly-trigger" build number 3719
00:00:00.000 originally caused by:
00:00:00.000 Started by timer
00:00:00.025 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.026 The recommended git tool is: git
00:00:00.026 using credential 00000000-0000-0000-0000-000000000002
00:00:00.029 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.043 Fetching changes from the remote Git repository
00:00:00.050 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.066 Using shallow fetch with depth 1
00:00:00.066 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.066 > git --version # timeout=10
00:00:00.096 > git --version # 'git version 2.39.2'
00:00:00.096 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.125 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.126 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:04.591 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:04.601 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:04.612 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD)
00:00:04.612 > git config core.sparsecheckout # timeout=10
00:00:04.621 > git read-tree -mu HEAD # timeout=10
00:00:04.635 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5
00:00:04.651 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag"
00:00:04.651 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10
00:00:04.728 [Pipeline] Start of Pipeline
00:00:04.739 [Pipeline] library
00:00:04.740 Loading library shm_lib@master
00:00:04.740 Library shm_lib@master is cached. Copying from home.
00:00:04.754 [Pipeline] node
00:00:04.775 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:04.777 [Pipeline] {
00:00:04.787 [Pipeline] catchError
00:00:04.789 [Pipeline] {
00:00:04.799 [Pipeline] wrap
00:00:04.807 [Pipeline] {
00:00:04.816 [Pipeline] stage
00:00:04.818 [Pipeline] { (Prologue)
00:00:05.033 [Pipeline] sh
00:00:05.319 + logger -p user.info -t JENKINS-CI
00:00:05.337 [Pipeline] echo
00:00:05.339 Node: WFP20
00:00:05.344 [Pipeline] sh
00:00:05.643 [Pipeline] setCustomBuildProperty
00:00:05.656 [Pipeline] echo
00:00:05.657 Cleanup processes
00:00:05.663 [Pipeline] sh
00:00:05.947 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.947 358632 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:05.960 [Pipeline] sh
00:00:06.247 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:06.247 ++ grep -v 'sudo pgrep'
00:00:06.247 ++ awk '{print $1}'
00:00:06.247 + sudo kill -9
00:00:06.247 + true
00:00:06.260 [Pipeline] cleanWs
00:00:06.270 [WS-CLEANUP] Deleting project workspace...
00:00:06.270 [WS-CLEANUP] Deferred wipeout is used...
00:00:06.277 [WS-CLEANUP] done
00:00:06.281 [Pipeline] setCustomBuildProperty
00:00:06.296 [Pipeline] sh
00:00:06.582 + sudo git config --global --replace-all safe.directory '*'
00:00:06.669 [Pipeline] httpRequest
00:00:06.997 [Pipeline] echo
00:00:06.999 Sorcerer 10.211.164.20 is alive
00:00:07.006 [Pipeline] retry
00:00:07.008 [Pipeline] {
00:00:07.017 [Pipeline] httpRequest
00:00:07.021 HttpMethod: GET
00:00:07.022 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:07.022 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:07.026 Response Code: HTTP/1.1 200 OK
00:00:07.027 Success: Status code 200 is in the accepted range: 200,404
00:00:07.028 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:08.336 [Pipeline] }
00:00:08.353 [Pipeline] // retry
00:00:08.360 [Pipeline] sh
00:00:08.645 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:08.661 [Pipeline] httpRequest
00:00:09.040 [Pipeline] echo
00:00:09.042 Sorcerer 10.211.164.20 is alive
00:00:09.051 [Pipeline] retry
00:00:09.053 [Pipeline] {
00:00:09.067 [Pipeline] httpRequest
00:00:09.072 HttpMethod: GET
00:00:09.072 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:09.074 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:09.096 Response Code: HTTP/1.1 200 OK
00:00:09.097 Success: Status code 200 is in the accepted range: 200,404
00:00:09.097 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:00:59.957 [Pipeline] }
00:00:59.973 [Pipeline] // retry
00:00:59.980 [Pipeline] sh
00:01:00.264 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz
00:01:02.814 [Pipeline] sh
00:01:03.099 + git -C spdk log --oneline -n5
00:01:03.099 c13c99a5e test: Various fixes for Fedora40
00:01:03.099 726a04d70 test/nvmf: adjust timeout for bigger nvmes
00:01:03.099 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11
00:01:03.099 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched
00:01:03.099 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges
00:01:03.119 [Pipeline] withCredentials
00:01:03.129 > git --version # timeout=10
00:01:03.143 > git --version # 'git version 2.39.2'
00:01:03.160 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:01:03.162 [Pipeline] {
00:01:03.171 [Pipeline] retry
00:01:03.173 [Pipeline] {
00:01:03.188 [Pipeline] sh
00:01:03.472 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4
00:01:03.744 [Pipeline] }
00:01:03.761 [Pipeline] // retry
00:01:03.766 [Pipeline] }
00:01:03.783 [Pipeline] // withCredentials
00:01:03.792 [Pipeline] httpRequest
00:01:04.173 [Pipeline] echo
00:01:04.174 Sorcerer 10.211.164.20 is alive
00:01:04.183 [Pipeline] retry
00:01:04.185 [Pipeline] {
00:01:04.198 [Pipeline] httpRequest
00:01:04.202 HttpMethod: GET
00:01:04.203 URL: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:04.203 Sending request to url: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:04.207 Response Code: HTTP/1.1 200 OK
00:01:04.207 Success: Status code 200 is in the accepted range: 200,404
00:01:04.207 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:09.425 [Pipeline] }
00:01:09.442 [Pipeline] // retry
00:01:09.450 [Pipeline] sh
00:01:09.736 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz
00:01:11.127 [Pipeline] sh
00:01:11.412 + git -C dpdk log --oneline -n5
00:01:11.412 caf0f5d395 version: 22.11.4
00:01:11.412 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt"
00:01:11.412 dc9c799c7d vhost: fix missing spinlock unlock
00:01:11.412 4307659a90 net/mlx5: fix LACP redirection in Rx domain
00:01:11.412 6ef77f2a5e net/gve: fix RX buffer size alignment
00:01:11.422 [Pipeline] }
00:01:11.435 [Pipeline] // stage
00:01:11.444 [Pipeline] stage
00:01:11.446 [Pipeline] { (Prepare)
00:01:11.465 [Pipeline] writeFile
00:01:11.481 [Pipeline] sh
00:01:11.766 + logger -p user.info -t JENKINS-CI
00:01:11.779 [Pipeline] sh
00:01:12.064 + logger -p user.info -t JENKINS-CI
00:01:12.076 [Pipeline] sh
00:01:12.362 + cat autorun-spdk.conf
00:01:12.362 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:12.362 SPDK_RUN_UBSAN=1
00:01:12.362 SPDK_TEST_FUZZER=1
00:01:12.362 SPDK_TEST_FUZZER_SHORT=1
00:01:12.362 SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:12.362 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:12.370 RUN_NIGHTLY=1
00:01:12.375 [Pipeline] readFile
00:01:12.399 [Pipeline] withEnv
00:01:12.401 [Pipeline] {
00:01:12.413 [Pipeline] sh
00:01:12.700 + set -ex
00:01:12.700 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:12.700 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:12.700 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:12.700 ++ SPDK_RUN_UBSAN=1
00:01:12.700 ++ SPDK_TEST_FUZZER=1
00:01:12.700 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:12.700 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:12.700 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:12.700 ++ RUN_NIGHTLY=1
00:01:12.700 + case $SPDK_TEST_NVMF_NICS in
00:01:12.700 + DRIVERS=
00:01:12.700 + [[ -n '' ]]
00:01:12.700 + exit 0
00:01:12.709 [Pipeline] }
00:01:12.724 [Pipeline] // withEnv
00:01:12.729 [Pipeline] }
00:01:12.743 [Pipeline] // stage
00:01:12.752 [Pipeline] catchError
00:01:12.754 [Pipeline] {
00:01:12.767 [Pipeline] timeout
00:01:12.767 Timeout set to expire in 30 min
00:01:12.769 [Pipeline] {
00:01:12.783 [Pipeline] stage
00:01:12.785 [Pipeline] { (Tests)
00:01:12.799 [Pipeline] sh
00:01:13.086 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:13.086 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:13.086 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:13.086 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:13.086 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:13.086 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:13.086 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:13.086 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
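Note: the autorun-spdk.conf dumped above is a plain shell fragment. The wrapper sources it, so every KEY=value line becomes a shell variable that gates a setup or test step; that is why the trace shows "case $SPDK_TEST_NVMF_NICS in", an empty DRIVERS=, and an early "exit 0". A minimal sketch of that gating pattern follows; the mlx5 driver list is an assumed illustration, not taken from this log:

    #!/usr/bin/env bash
    set -e
    # Every KEY=value line in the conf becomes a shell variable.
    source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
    # NIC kernel drivers are loaded only when the corresponding flag is set;
    # in this run SPDK_TEST_NVMF_NICS is unset, so DRIVERS stays empty and
    # the step exits early, exactly as the xtrace above shows.
    case "${SPDK_TEST_NVMF_NICS:-}" in
        mlx5) DRIVERS='mlx5_core mlx5_ib' ;;  # assumed mapping, illustration only
        *) DRIVERS='' ;;
    esac
    [[ -n "$DRIVERS" ]] || exit 0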
00:01:13.086 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:13.086 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:13.086 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:13.086 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:13.086 + source /etc/os-release
00:01:13.086 ++ NAME='Fedora Linux'
00:01:13.086 ++ VERSION='39 (Cloud Edition)'
00:01:13.086 ++ ID=fedora
00:01:13.086 ++ VERSION_ID=39
00:01:13.086 ++ VERSION_CODENAME=
00:01:13.086 ++ PLATFORM_ID=platform:f39
00:01:13.086 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:13.086 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:13.086 ++ LOGO=fedora-logo-icon
00:01:13.086 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:13.086 ++ HOME_URL=https://fedoraproject.org/
00:01:13.086 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:13.086 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:13.086 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:13.086 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:13.086 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:13.086 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:13.086 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:13.086 ++ SUPPORT_END=2024-11-12
00:01:13.086 ++ VARIANT='Cloud Edition'
00:01:13.086 ++ VARIANT_ID=cloud
00:01:13.086 + uname -a
00:01:13.086 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:13.086 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:16.381 Hugepages
00:01:16.381 node hugesize free / total
00:01:16.381 node0 1048576kB 0 / 0
00:01:16.381 node0 2048kB 0 / 0
00:01:16.381 node1 1048576kB 0 / 0
00:01:16.381 node1 2048kB 0 / 0
00:01:16.381
00:01:16.381 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:16.381 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:16.381 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:16.381 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:16.381 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:16.381 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:16.381 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:16.381 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:16.381 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:16.381 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:16.381 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:16.381 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:16.381 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:16.381 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:16.381 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:16.381 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:16.381 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:16.381 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:16.382 + rm -f /tmp/spdk-ld-path
00:01:16.382 + source autorun-spdk.conf
00:01:16.382 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:16.382 ++ SPDK_RUN_UBSAN=1
00:01:16.382 ++ SPDK_TEST_FUZZER=1
00:01:16.382 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:16.382 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:16.382 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:16.382 ++ RUN_NIGHTLY=1
00:01:16.382 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:16.382 + [[ -n '' ]]
00:01:16.382 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:16.382 + for M in /var/spdk/build-*-manifest.txt
00:01:16.382 + [[ -f /var/spdk/build-kernel-manifest.txt ]]
00:01:16.382 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:16.382 + for M in /var/spdk/build-*-manifest.txt
00:01:16.382 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:16.382 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:16.382 + for M in /var/spdk/build-*-manifest.txt
00:01:16.382 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:16.382 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/
00:01:16.382 ++ uname
00:01:16.382 + [[ Linux == \L\i\n\u\x ]]
00:01:16.382 + sudo dmesg -T
00:01:16.382 + sudo dmesg --clear
00:01:16.382 + dmesg_pid=359555
00:01:16.382 + [[ Fedora Linux == FreeBSD ]]
00:01:16.382 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:16.382 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:16.382 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:16.382 + [[ -x /usr/src/fio-static/fio ]]
00:01:16.382 + export FIO_BIN=/usr/src/fio-static/fio
00:01:16.382 + FIO_BIN=/usr/src/fio-static/fio
00:01:16.382 + sudo dmesg -Tw
00:01:16.382 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:16.382 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:16.382 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:16.382 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:16.382 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:16.382 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:16.382 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:16.382 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:16.382 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:16.382 Test configuration:
00:01:16.382 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:16.382 SPDK_RUN_UBSAN=1
00:01:16.382 SPDK_TEST_FUZZER=1
00:01:16.382 SPDK_TEST_FUZZER_SHORT=1
00:01:16.382 SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:16.382 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:16.382 RUN_NIGHTLY=1
06:56:34 -- common/autotest_common.sh@1689 -- $ [[ n == y ]]
06:56:34 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
06:56:34 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
06:56:34 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
06:56:34 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
06:56:34 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
06:56:34 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
06:56:34 -- paths/export.sh@4 -- $
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.382 06:56:34 -- paths/export.sh@5 -- $ export PATH 00:01:16.382 06:56:34 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.382 06:56:34 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:16.382 06:56:34 -- common/autobuild_common.sh@440 -- $ date +%s 00:01:16.382 06:56:34 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1734069394.XXXXXX 00:01:16.382 06:56:34 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1734069394.06L4Vi 00:01:16.382 06:56:34 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:01:16.382 06:56:34 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']' 00:01:16.382 06:56:34 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:16.382 06:56:34 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:16.382 06:56:34 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:16.382 06:56:34 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:16.382 06:56:34 -- common/autobuild_common.sh@456 -- $ get_config_params 00:01:16.382 06:56:34 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:01:16.382 06:56:34 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.382 06:56:34 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:16.382 06:56:34 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:16.382 06:56:34 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:16.382 06:56:34 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:16.382 06:56:34 -- spdk/autobuild.sh@16 -- $ date -u 00:01:16.382 Fri Dec 13 05:56:34 AM UTC 2024 00:01:16.382 06:56:34 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:16.382 LTS-67-gc13c99a5e 00:01:16.382 06:56:34 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:16.382 06:56:34 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:16.382 06:56:34 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:16.382 06:56:34 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:16.382 06:56:34 -- common/autotest_common.sh@1093 -- $ 
xtrace_disable 00:01:16.382 06:56:34 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.382 ************************************ 00:01:16.382 START TEST ubsan 00:01:16.382 ************************************ 00:01:16.382 06:56:34 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan' 00:01:16.382 using ubsan 00:01:16.382 00:01:16.382 real 0m0.000s 00:01:16.382 user 0m0.000s 00:01:16.382 sys 0m0.000s 00:01:16.382 06:56:34 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:16.382 06:56:34 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.382 ************************************ 00:01:16.382 END TEST ubsan 00:01:16.382 ************************************ 00:01:16.382 06:56:34 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:16.382 06:56:34 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:16.382 06:56:34 -- common/autobuild_common.sh@432 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:16.382 06:56:34 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:01:16.382 06:56:34 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:16.382 06:56:34 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.382 ************************************ 00:01:16.382 START TEST build_native_dpdk 00:01:16.382 ************************************ 00:01:16.382 06:56:34 -- common/autotest_common.sh@1114 -- $ _build_native_dpdk 00:01:16.382 06:56:34 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:16.382 06:56:34 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:16.382 06:56:34 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:16.382 06:56:34 -- common/autobuild_common.sh@51 -- $ local compiler 00:01:16.382 06:56:34 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:16.382 06:56:34 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:16.382 06:56:34 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:16.382 06:56:34 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:16.382 06:56:34 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:16.382 06:56:34 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:16.382 06:56:34 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:16.382 06:56:34 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:16.382 06:56:34 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:16.382 06:56:34 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:16.382 06:56:34 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:16.382 06:56:34 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:16.382 06:56:34 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:16.382 06:56:34 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:16.382 06:56:34 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:16.382 06:56:34 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:16.382 caf0f5d395 version: 22.11.4 00:01:16.382 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:16.383 dc9c799c7d vhost: fix missing spinlock unlock 00:01:16.383 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:16.383 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:16.383 06:56:34 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:16.383 06:56:34 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:16.383 06:56:34 -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:16.383 06:56:34 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:16.383 06:56:34 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:16.383 06:56:34 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:16.383 06:56:34 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:16.383 06:56:34 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:16.383 06:56:34 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:16.383 06:56:34 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:16.383 06:56:34 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:16.383 06:56:34 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:16.383 06:56:34 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:16.383 06:56:34 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:16.383 06:56:34 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:16.383 06:56:34 -- common/autobuild_common.sh@168 -- $ uname -s 00:01:16.383 06:56:34 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:16.383 06:56:34 -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:01:16.383 06:56:34 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:01:16.383 06:56:34 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:16.383 06:56:34 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:16.383 06:56:34 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:16.383 06:56:34 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:16.383 06:56:34 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:16.383 06:56:34 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:16.383 06:56:34 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:16.383 06:56:34 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:16.383 06:56:34 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:16.383 06:56:34 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:16.383 06:56:34 -- scripts/common.sh@343 -- $ case "$op" in 00:01:16.383 06:56:34 -- scripts/common.sh@344 -- $ : 1 00:01:16.383 06:56:34 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:16.383 06:56:34 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:16.643 06:56:34 -- scripts/common.sh@364 -- $ decimal 22 00:01:16.643 06:56:34 -- scripts/common.sh@352 -- $ local d=22 00:01:16.643 06:56:34 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:16.643 06:56:34 -- scripts/common.sh@354 -- $ echo 22 00:01:16.643 06:56:34 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:16.643 06:56:34 -- scripts/common.sh@365 -- $ decimal 21 00:01:16.643 06:56:34 -- scripts/common.sh@352 -- $ local d=21 00:01:16.643 06:56:34 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:16.643 06:56:34 -- scripts/common.sh@354 -- $ echo 21 00:01:16.643 06:56:34 -- scripts/common.sh@365 -- $ ver2[v]=21 00:01:16.643 06:56:34 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:16.643 06:56:34 -- scripts/common.sh@366 -- $ return 1 00:01:16.643 06:56:34 -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:16.643 patching file config/rte_config.h 00:01:16.643 Hunk #1 succeeded at 60 (offset 1 line). 00:01:16.643 06:56:34 -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:01:16.643 06:56:34 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:01:16.643 06:56:34 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:16.643 06:56:34 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:16.643 06:56:34 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:16.643 06:56:34 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:16.643 06:56:34 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:16.643 06:56:34 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:16.643 06:56:34 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:16.643 06:56:34 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:16.643 06:56:34 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:16.643 06:56:34 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:16.643 06:56:34 -- scripts/common.sh@343 -- $ case "$op" in 00:01:16.643 06:56:34 -- scripts/common.sh@344 -- $ : 1 00:01:16.643 06:56:34 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:16.643 06:56:34 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:16.643 06:56:34 -- scripts/common.sh@364 -- $ decimal 22 00:01:16.643 06:56:34 -- scripts/common.sh@352 -- $ local d=22 00:01:16.643 06:56:34 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:16.643 06:56:34 -- scripts/common.sh@354 -- $ echo 22 00:01:16.643 06:56:34 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:16.643 06:56:34 -- scripts/common.sh@365 -- $ decimal 24 00:01:16.643 06:56:34 -- scripts/common.sh@352 -- $ local d=24 00:01:16.643 06:56:34 -- scripts/common.sh@353 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:16.643 06:56:34 -- scripts/common.sh@354 -- $ echo 24 00:01:16.643 06:56:34 -- scripts/common.sh@365 -- $ ver2[v]=24 00:01:16.643 06:56:34 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:16.643 06:56:34 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:01:16.643 06:56:34 -- scripts/common.sh@367 -- $ return 0 00:01:16.643 06:56:34 -- common/autobuild_common.sh@177 -- $ patch -p1 00:01:16.643 patching file lib/pcapng/rte_pcapng.c 00:01:16.643 Hunk #1 succeeded at 110 (offset -18 lines). 
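Note: the xtrace above shows scripts/common.sh comparing dotted versions field by field (IFS=.-: splits each version into an array, and each field passes through the script's decimal helper) to decide which DPDK compatibility patches apply: 22.11.4 is not older than 21.11.0, so the rte_config.h patch runs, and it is older than 24.07.0, so the rte_pcapng.c patch runs too. A standalone sketch of that comparison logic, simplified, with version_lt as a hypothetical helper name rather than the script's own:

    # Succeed (return 0) when $1 is strictly older than $2.
    version_lt() {
        local IFS=.
        local -a a=($1) b=($2)
        local i
        for ((i = 0; i < ${#a[@]} || i < ${#b[@]}; i++)); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0  # first differing field decides
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1  # equal versions are not less-than
    }
    version_lt 22.11.4 21.11.0 || echo 'apply rte_config.h patch'   # not older than 21.11.0
    version_lt 22.11.4 24.07.0 && echo 'apply rte_pcapng.c patch'   # older than 24.07.0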
00:01:16.643 06:56:34 -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false
06:56:34 -- common/autobuild_common.sh@181 -- $ uname -s
06:56:34 -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']'
06:56:34 -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base
06:56:34 -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:21.937 The Meson build system
00:01:21.937 Version: 1.5.0
00:01:21.937 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
00:01:21.937 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp
00:01:21.937 Build type: native build
00:01:21.937 Program cat found: YES (/usr/bin/cat)
00:01:21.937 Project name: DPDK
00:01:21.937 Project version: 22.11.4
00:01:21.937 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)")
00:01:21.937 C linker for the host machine: gcc ld.bfd 2.40-14
00:01:21.937 Host machine cpu family: x86_64
00:01:21.937 Host machine cpu: x86_64
00:01:21.937 Message: ## Building in Developer Mode ##
00:01:21.937 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:21.937 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh)
00:01:21.938 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh)
00:01:21.938 Program objdump found: YES (/usr/bin/objdump)
00:01:21.938 Program python3 found: YES (/usr/bin/python3)
00:01:21.938 Program cat found: YES (/usr/bin/cat)
00:01:21.938 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead.
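Note: the deprecation warning above is triggered by the -Dmachine=native flag in the logged meson command. A hedged sketch of the same configure step using the option name the warning asks for, with all other paths and flags mirroring the logged command (an untested substitution, not what this job ran):

    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false \
        -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Dcpu_instruction_set=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
    # -Dcpu_instruction_set replaces the deprecated -Dmachine; everything else
    # is copied from the command logged above.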
00:01:21.938 Checking for size of "void *" : 8 00:01:21.938 Checking for size of "void *" : 8 (cached) 00:01:21.938 Library m found: YES 00:01:21.938 Library numa found: YES 00:01:21.938 Has header "numaif.h" : YES 00:01:21.938 Library fdt found: NO 00:01:21.938 Library execinfo found: NO 00:01:21.938 Has header "execinfo.h" : YES 00:01:21.938 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:21.938 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:21.938 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:21.938 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:21.938 Run-time dependency openssl found: YES 3.1.1 00:01:21.938 Run-time dependency libpcap found: YES 1.10.4 00:01:21.938 Has header "pcap.h" with dependency libpcap: YES 00:01:21.938 Compiler for C supports arguments -Wcast-qual: YES 00:01:21.938 Compiler for C supports arguments -Wdeprecated: YES 00:01:21.938 Compiler for C supports arguments -Wformat: YES 00:01:21.938 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:21.938 Compiler for C supports arguments -Wformat-security: NO 00:01:21.938 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:21.938 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:21.938 Compiler for C supports arguments -Wnested-externs: YES 00:01:21.938 Compiler for C supports arguments -Wold-style-definition: YES 00:01:21.938 Compiler for C supports arguments -Wpointer-arith: YES 00:01:21.938 Compiler for C supports arguments -Wsign-compare: YES 00:01:21.938 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:21.938 Compiler for C supports arguments -Wundef: YES 00:01:21.938 Compiler for C supports arguments -Wwrite-strings: YES 00:01:21.938 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:21.938 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:21.938 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:21.938 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:21.938 Compiler for C supports arguments -mavx512f: YES 00:01:21.938 Checking if "AVX512 checking" compiles: YES 00:01:21.938 Fetching value of define "__SSE4_2__" : 1 00:01:21.938 Fetching value of define "__AES__" : 1 00:01:21.938 Fetching value of define "__AVX__" : 1 00:01:21.938 Fetching value of define "__AVX2__" : 1 00:01:21.938 Fetching value of define "__AVX512BW__" : 1 00:01:21.938 Fetching value of define "__AVX512CD__" : 1 00:01:21.938 Fetching value of define "__AVX512DQ__" : 1 00:01:21.938 Fetching value of define "__AVX512F__" : 1 00:01:21.938 Fetching value of define "__AVX512VL__" : 1 00:01:21.938 Fetching value of define "__PCLMUL__" : 1 00:01:21.938 Fetching value of define "__RDRND__" : 1 00:01:21.938 Fetching value of define "__RDSEED__" : 1 00:01:21.938 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:21.938 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:21.938 Message: lib/kvargs: Defining dependency "kvargs" 00:01:21.938 Message: lib/telemetry: Defining dependency "telemetry" 00:01:21.938 Checking for function "getentropy" : YES 00:01:21.938 Message: lib/eal: Defining dependency "eal" 00:01:21.938 Message: lib/ring: Defining dependency "ring" 00:01:21.938 Message: lib/rcu: Defining dependency "rcu" 00:01:21.938 Message: lib/mempool: Defining dependency "mempool" 00:01:21.938 Message: lib/mbuf: Defining dependency "mbuf" 00:01:21.938 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:21.938 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:21.938 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:21.938 Compiler for C supports arguments -mpclmul: YES 00:01:21.938 Compiler for C supports arguments -maes: YES 00:01:21.938 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:21.938 Compiler for C supports arguments -mavx512bw: YES 00:01:21.938 Compiler for C supports arguments -mavx512dq: YES 00:01:21.938 Compiler for C supports arguments -mavx512vl: YES 00:01:21.938 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:21.938 Compiler for C supports arguments -mavx2: YES 00:01:21.938 Compiler for C supports arguments -mavx: YES 00:01:21.938 Message: lib/net: Defining dependency "net" 00:01:21.938 Message: lib/meter: Defining dependency "meter" 00:01:21.938 Message: lib/ethdev: Defining dependency "ethdev" 00:01:21.938 Message: lib/pci: Defining dependency "pci" 00:01:21.938 Message: lib/cmdline: Defining dependency "cmdline" 00:01:21.938 Message: lib/metrics: Defining dependency "metrics" 00:01:21.938 Message: lib/hash: Defining dependency "hash" 00:01:21.938 Message: lib/timer: Defining dependency "timer" 00:01:21.938 Fetching value of define "__AVX2__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:21.938 Message: lib/acl: Defining dependency "acl" 00:01:21.938 Message: lib/bbdev: Defining dependency "bbdev" 00:01:21.938 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:21.938 Run-time dependency libelf found: YES 0.191 00:01:21.938 Message: lib/bpf: Defining dependency "bpf" 00:01:21.938 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:21.938 Message: lib/compressdev: Defining dependency "compressdev" 00:01:21.938 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:21.938 Message: lib/distributor: Defining dependency "distributor" 00:01:21.938 Message: lib/efd: Defining dependency "efd" 00:01:21.938 Message: lib/eventdev: Defining dependency "eventdev" 00:01:21.938 Message: lib/gpudev: Defining dependency "gpudev" 00:01:21.938 Message: lib/gro: Defining dependency "gro" 00:01:21.938 Message: lib/gso: Defining dependency "gso" 00:01:21.938 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:21.938 Message: lib/jobstats: Defining dependency "jobstats" 00:01:21.938 Message: lib/latencystats: Defining dependency "latencystats" 00:01:21.938 Message: lib/lpm: Defining dependency "lpm" 00:01:21.938 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512IFMA__" : (undefined) 00:01:21.938 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:21.938 Message: lib/member: Defining dependency "member" 00:01:21.938 Message: lib/pcapng: Defining dependency "pcapng" 00:01:21.938 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:21.938 Message: lib/power: Defining dependency "power" 00:01:21.938 Message: lib/rawdev: Defining dependency "rawdev" 00:01:21.938 Message: lib/regexdev: Defining dependency "regexdev" 00:01:21.938 Message: lib/dmadev: 
Defining dependency "dmadev" 00:01:21.938 Message: lib/rib: Defining dependency "rib" 00:01:21.938 Message: lib/reorder: Defining dependency "reorder" 00:01:21.938 Message: lib/sched: Defining dependency "sched" 00:01:21.938 Message: lib/security: Defining dependency "security" 00:01:21.938 Message: lib/stack: Defining dependency "stack" 00:01:21.938 Has header "linux/userfaultfd.h" : YES 00:01:21.938 Message: lib/vhost: Defining dependency "vhost" 00:01:21.938 Message: lib/ipsec: Defining dependency "ipsec" 00:01:21.938 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:21.938 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:21.938 Message: lib/fib: Defining dependency "fib" 00:01:21.938 Message: lib/port: Defining dependency "port" 00:01:21.938 Message: lib/pdump: Defining dependency "pdump" 00:01:21.938 Message: lib/table: Defining dependency "table" 00:01:21.938 Message: lib/pipeline: Defining dependency "pipeline" 00:01:21.938 Message: lib/graph: Defining dependency "graph" 00:01:21.938 Message: lib/node: Defining dependency "node" 00:01:21.938 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:21.938 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:21.938 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:21.938 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:21.938 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:21.938 Compiler for C supports arguments -Wno-unused-value: YES 00:01:21.938 Compiler for C supports arguments -Wno-format: YES 00:01:21.938 Compiler for C supports arguments -Wno-format-security: YES 00:01:21.938 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:22.516 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:22.516 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:22.516 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:22.516 Fetching value of define "__AVX2__" : 1 (cached) 00:01:22.516 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:22.516 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:22.516 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:22.516 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:22.516 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:22.516 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:22.516 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:22.516 Configuring doxy-api.conf using configuration 00:01:22.516 Program sphinx-build found: NO 00:01:22.516 Configuring rte_build_config.h using configuration 00:01:22.516 Message: 00:01:22.516 ================= 00:01:22.516 Applications Enabled 00:01:22.516 ================= 00:01:22.516 00:01:22.516 apps: 00:01:22.516 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:01:22.516 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:01:22.516 test-security-perf, 00:01:22.516 00:01:22.516 Message: 00:01:22.516 ================= 00:01:22.516 Libraries Enabled 00:01:22.516 ================= 00:01:22.516 00:01:22.516 libs: 00:01:22.516 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:01:22.516 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:01:22.516 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:01:22.516 eventdev, 
gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm, 00:01:22.516 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:01:22.516 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:01:22.516 table, pipeline, graph, node, 00:01:22.516 00:01:22.516 Message: 00:01:22.516 =============== 00:01:22.516 Drivers Enabled 00:01:22.516 =============== 00:01:22.516 00:01:22.516 common: 00:01:22.516 00:01:22.516 bus: 00:01:22.516 pci, vdev, 00:01:22.516 mempool: 00:01:22.516 ring, 00:01:22.516 dma: 00:01:22.516 00:01:22.516 net: 00:01:22.516 i40e, 00:01:22.516 raw: 00:01:22.516 00:01:22.516 crypto: 00:01:22.516 00:01:22.516 compress: 00:01:22.516 00:01:22.516 regex: 00:01:22.516 00:01:22.516 vdpa: 00:01:22.516 00:01:22.516 event: 00:01:22.516 00:01:22.516 baseband: 00:01:22.516 00:01:22.516 gpu: 00:01:22.516 00:01:22.516 00:01:22.516 Message: 00:01:22.516 ================= 00:01:22.516 Content Skipped 00:01:22.516 ================= 00:01:22.516 00:01:22.516 apps: 00:01:22.516 00:01:22.516 libs: 00:01:22.516 kni: explicitly disabled via build config (deprecated lib) 00:01:22.516 flow_classify: explicitly disabled via build config (deprecated lib) 00:01:22.516 00:01:22.516 drivers: 00:01:22.516 common/cpt: not in enabled drivers build config 00:01:22.516 common/dpaax: not in enabled drivers build config 00:01:22.516 common/iavf: not in enabled drivers build config 00:01:22.516 common/idpf: not in enabled drivers build config 00:01:22.516 common/mvep: not in enabled drivers build config 00:01:22.516 common/octeontx: not in enabled drivers build config 00:01:22.516 bus/auxiliary: not in enabled drivers build config 00:01:22.516 bus/dpaa: not in enabled drivers build config 00:01:22.516 bus/fslmc: not in enabled drivers build config 00:01:22.516 bus/ifpga: not in enabled drivers build config 00:01:22.516 bus/vmbus: not in enabled drivers build config 00:01:22.516 common/cnxk: not in enabled drivers build config 00:01:22.516 common/mlx5: not in enabled drivers build config 00:01:22.516 common/qat: not in enabled drivers build config 00:01:22.516 common/sfc_efx: not in enabled drivers build config 00:01:22.516 mempool/bucket: not in enabled drivers build config 00:01:22.516 mempool/cnxk: not in enabled drivers build config 00:01:22.516 mempool/dpaa: not in enabled drivers build config 00:01:22.516 mempool/dpaa2: not in enabled drivers build config 00:01:22.516 mempool/octeontx: not in enabled drivers build config 00:01:22.516 mempool/stack: not in enabled drivers build config 00:01:22.516 dma/cnxk: not in enabled drivers build config 00:01:22.516 dma/dpaa: not in enabled drivers build config 00:01:22.516 dma/dpaa2: not in enabled drivers build config 00:01:22.516 dma/hisilicon: not in enabled drivers build config 00:01:22.516 dma/idxd: not in enabled drivers build config 00:01:22.516 dma/ioat: not in enabled drivers build config 00:01:22.516 dma/skeleton: not in enabled drivers build config 00:01:22.516 net/af_packet: not in enabled drivers build config 00:01:22.516 net/af_xdp: not in enabled drivers build config 00:01:22.516 net/ark: not in enabled drivers build config 00:01:22.516 net/atlantic: not in enabled drivers build config 00:01:22.516 net/avp: not in enabled drivers build config 00:01:22.516 net/axgbe: not in enabled drivers build config 00:01:22.516 net/bnx2x: not in enabled drivers build config 00:01:22.516 net/bnxt: not in enabled drivers build config 00:01:22.516 net/bonding: not in enabled drivers build config 00:01:22.516 net/cnxk: not in enabled drivers build config 
00:01:22.516 net/cxgbe: not in enabled drivers build config 00:01:22.516 net/dpaa: not in enabled drivers build config 00:01:22.516 net/dpaa2: not in enabled drivers build config 00:01:22.516 net/e1000: not in enabled drivers build config 00:01:22.516 net/ena: not in enabled drivers build config 00:01:22.516 net/enetc: not in enabled drivers build config 00:01:22.516 net/enetfec: not in enabled drivers build config 00:01:22.516 net/enic: not in enabled drivers build config 00:01:22.516 net/failsafe: not in enabled drivers build config 00:01:22.516 net/fm10k: not in enabled drivers build config 00:01:22.516 net/gve: not in enabled drivers build config 00:01:22.516 net/hinic: not in enabled drivers build config 00:01:22.516 net/hns3: not in enabled drivers build config 00:01:22.516 net/iavf: not in enabled drivers build config 00:01:22.516 net/ice: not in enabled drivers build config 00:01:22.516 net/idpf: not in enabled drivers build config 00:01:22.516 net/igc: not in enabled drivers build config 00:01:22.516 net/ionic: not in enabled drivers build config 00:01:22.516 net/ipn3ke: not in enabled drivers build config 00:01:22.516 net/ixgbe: not in enabled drivers build config 00:01:22.516 net/kni: not in enabled drivers build config 00:01:22.516 net/liquidio: not in enabled drivers build config 00:01:22.516 net/mana: not in enabled drivers build config 00:01:22.516 net/memif: not in enabled drivers build config 00:01:22.516 net/mlx4: not in enabled drivers build config 00:01:22.516 net/mlx5: not in enabled drivers build config 00:01:22.516 net/mvneta: not in enabled drivers build config 00:01:22.516 net/mvpp2: not in enabled drivers build config 00:01:22.516 net/netvsc: not in enabled drivers build config 00:01:22.516 net/nfb: not in enabled drivers build config 00:01:22.516 net/nfp: not in enabled drivers build config 00:01:22.516 net/ngbe: not in enabled drivers build config 00:01:22.516 net/null: not in enabled drivers build config 00:01:22.516 net/octeontx: not in enabled drivers build config 00:01:22.516 net/octeon_ep: not in enabled drivers build config 00:01:22.516 net/pcap: not in enabled drivers build config 00:01:22.516 net/pfe: not in enabled drivers build config 00:01:22.516 net/qede: not in enabled drivers build config 00:01:22.516 net/ring: not in enabled drivers build config 00:01:22.516 net/sfc: not in enabled drivers build config 00:01:22.516 net/softnic: not in enabled drivers build config 00:01:22.516 net/tap: not in enabled drivers build config 00:01:22.516 net/thunderx: not in enabled drivers build config 00:01:22.516 net/txgbe: not in enabled drivers build config 00:01:22.516 net/vdev_netvsc: not in enabled drivers build config 00:01:22.516 net/vhost: not in enabled drivers build config 00:01:22.516 net/virtio: not in enabled drivers build config 00:01:22.516 net/vmxnet3: not in enabled drivers build config 00:01:22.516 raw/cnxk_bphy: not in enabled drivers build config 00:01:22.516 raw/cnxk_gpio: not in enabled drivers build config 00:01:22.516 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:22.516 raw/ifpga: not in enabled drivers build config 00:01:22.516 raw/ntb: not in enabled drivers build config 00:01:22.516 raw/skeleton: not in enabled drivers build config 00:01:22.516 crypto/armv8: not in enabled drivers build config 00:01:22.516 crypto/bcmfs: not in enabled drivers build config 00:01:22.516 crypto/caam_jr: not in enabled drivers build config 00:01:22.516 crypto/ccp: not in enabled drivers build config 00:01:22.516 crypto/cnxk: not in enabled drivers 
build config
00:01:22.516 crypto/dpaa_sec: not in enabled drivers build config
00:01:22.516 crypto/dpaa2_sec: not in enabled drivers build config
00:01:22.516 crypto/ipsec_mb: not in enabled drivers build config
00:01:22.516 crypto/mlx5: not in enabled drivers build config
00:01:22.516 crypto/mvsam: not in enabled drivers build config
00:01:22.516 crypto/nitrox: not in enabled drivers build config
00:01:22.516 crypto/null: not in enabled drivers build config
00:01:22.516 crypto/octeontx: not in enabled drivers build config
00:01:22.516 crypto/openssl: not in enabled drivers build config
00:01:22.516 crypto/scheduler: not in enabled drivers build config
00:01:22.516 crypto/uadk: not in enabled drivers build config
00:01:22.516 crypto/virtio: not in enabled drivers build config
00:01:22.516 compress/isal: not in enabled drivers build config
00:01:22.516 compress/mlx5: not in enabled drivers build config
00:01:22.516 compress/octeontx: not in enabled drivers build config
00:01:22.516 compress/zlib: not in enabled drivers build config
00:01:22.516 regex/mlx5: not in enabled drivers build config
00:01:22.516 regex/cn9k: not in enabled drivers build config
00:01:22.517 vdpa/ifc: not in enabled drivers build config
00:01:22.517 vdpa/mlx5: not in enabled drivers build config
00:01:22.517 vdpa/sfc: not in enabled drivers build config
00:01:22.517 event/cnxk: not in enabled drivers build config
00:01:22.517 event/dlb2: not in enabled drivers build config
00:01:22.517 event/dpaa: not in enabled drivers build config
00:01:22.517 event/dpaa2: not in enabled drivers build config
00:01:22.517 event/dsw: not in enabled drivers build config
00:01:22.517 event/opdl: not in enabled drivers build config
00:01:22.517 event/skeleton: not in enabled drivers build config
00:01:22.517 event/sw: not in enabled drivers build config
00:01:22.517 event/octeontx: not in enabled drivers build config
00:01:22.517 baseband/acc: not in enabled drivers build config
00:01:22.517 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:01:22.517 baseband/fpga_lte_fec: not in enabled drivers build config
00:01:22.517 baseband/la12xx: not in enabled drivers build config
00:01:22.517 baseband/null: not in enabled drivers build config
00:01:22.517 baseband/turbo_sw: not in enabled drivers build config
00:01:22.517 gpu/cuda: not in enabled drivers build config
00:01:22.517
00:01:22.517
00:01:22.517 Build targets in project: 311
00:01:22.517
00:01:22.517 DPDK 22.11.4
00:01:22.517
00:01:22.517 User defined options
00:01:22.517 libdir : lib
00:01:22.517 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:22.517 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:01:22.517 c_link_args :
00:01:22.517 enable_docs : false
00:01:22.517 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:01:22.517 enable_kmods : false
00:01:22.517 machine : native
00:01:22.517 tests : false
00:01:22.517
00:01:22.517 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:01:22.517 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
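Note: this final WARNING concerns invocation form only, not the configuration result: the step above ran as "meson [options]" with no subcommand, which newer Meson deprecates. A sketch of the explicit form, paired with the ninja step this log runs next; the -D options elided in the comment are exactly the ones in the logged configure command:

    meson setup build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib   # plus the logged -D options
    ninja -C build-tmp -j112   # same build invocation that follows in this log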
00:01:22.517 06:56:40 -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:22.517 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:22.517 [1/740] Generating lib/rte_kvargs_def with a custom command 00:01:22.517 [2/740] Generating lib/rte_kvargs_mingw with a custom command 00:01:22.517 [3/740] Generating lib/rte_telemetry_def with a custom command 00:01:22.517 [4/740] Generating lib/rte_telemetry_mingw with a custom command 00:01:22.517 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:22.785 [6/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:22.785 [7/740] Generating lib/rte_rcu_mingw with a custom command 00:01:22.785 [8/740] Generating lib/rte_mempool_mingw with a custom command 00:01:22.785 [9/740] Generating lib/rte_eal_mingw with a custom command 00:01:22.785 [10/740] Generating lib/rte_ring_def with a custom command 00:01:22.785 [11/740] Generating lib/rte_mbuf_def with a custom command 00:01:22.785 [12/740] Generating lib/rte_net_mingw with a custom command 00:01:22.785 [13/740] Generating lib/rte_eal_def with a custom command 00:01:22.785 [14/740] Generating lib/rte_ring_mingw with a custom command 00:01:22.785 [15/740] Generating lib/rte_rcu_def with a custom command 00:01:22.785 [16/740] Generating lib/rte_mempool_def with a custom command 00:01:22.785 [17/740] Generating lib/rte_mbuf_mingw with a custom command 00:01:22.785 [18/740] Generating lib/rte_meter_def with a custom command 00:01:22.785 [19/740] Generating lib/rte_meter_mingw with a custom command 00:01:22.785 [20/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:22.785 [21/740] Generating lib/rte_net_def with a custom command 00:01:22.785 [22/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:22.785 [23/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:22.785 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:22.785 [25/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:22.785 [26/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:22.785 [27/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:22.785 [28/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:22.785 [29/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:22.785 [30/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:01:22.785 [31/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:22.785 [32/740] Generating lib/rte_pci_mingw with a custom command 00:01:22.785 [33/740] Generating lib/rte_ethdev_def with a custom command 00:01:22.785 [34/740] Generating lib/rte_ethdev_mingw with a custom command 00:01:22.785 [35/740] Generating lib/rte_pci_def with a custom command 00:01:22.785 [36/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:22.785 [37/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:22.785 [38/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:22.785 [39/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:22.785 [40/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:22.785 [41/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 
00:01:22.785 [42/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:22.785 [43/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:22.785 [44/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:22.785 [45/740] Generating lib/rte_cmdline_mingw with a custom command 00:01:22.785 [46/740] Generating lib/rte_metrics_mingw with a custom command 00:01:22.785 [47/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:22.785 [48/740] Linking static target lib/librte_kvargs.a 00:01:22.785 [49/740] Generating lib/rte_cmdline_def with a custom command 00:01:22.785 [50/740] Generating lib/rte_metrics_def with a custom command 00:01:22.785 [51/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:22.785 [52/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:22.785 [53/740] Generating lib/rte_hash_def with a custom command 00:01:22.785 [54/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:22.785 [55/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:22.785 [56/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:22.785 [57/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:22.785 [58/740] Generating lib/rte_hash_mingw with a custom command 00:01:22.785 [59/740] Generating lib/rte_timer_def with a custom command 00:01:22.785 [60/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:22.785 [61/740] Generating lib/rte_timer_mingw with a custom command 00:01:22.785 [62/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:22.785 [63/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:22.785 [64/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:22.785 [65/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:22.785 [66/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:22.785 [67/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:22.785 [68/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:22.785 [69/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:22.785 [70/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:22.785 [71/740] Generating lib/rte_acl_def with a custom command 00:01:22.785 [72/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:22.785 [73/740] Generating lib/rte_acl_mingw with a custom command 00:01:22.785 [74/740] Generating lib/rte_bbdev_def with a custom command 00:01:22.785 [75/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:22.785 [76/740] Generating lib/rte_bbdev_mingw with a custom command 00:01:22.785 [77/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:22.785 [78/740] Generating lib/rte_bitratestats_def with a custom command 00:01:22.785 [79/740] Generating lib/rte_bitratestats_mingw with a custom command 00:01:22.785 [80/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:23.047 [81/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:23.048 [82/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:23.048 [83/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:23.048 [84/740] Linking 
static target lib/librte_pci.a 00:01:23.048 [85/740] Generating lib/rte_cfgfile_mingw with a custom command 00:01:23.048 [86/740] Linking static target lib/librte_meter.a 00:01:23.048 [87/740] Generating lib/rte_bpf_def with a custom command 00:01:23.048 [88/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:23.048 [89/740] Generating lib/rte_bpf_mingw with a custom command 00:01:23.048 [90/740] Generating lib/rte_cfgfile_def with a custom command 00:01:23.048 [91/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:23.048 [92/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:23.048 [93/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:23.048 [94/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:23.048 [95/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:23.048 [96/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:23.048 [97/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:23.048 [98/740] Generating lib/rte_compressdev_def with a custom command 00:01:23.048 [99/740] Generating lib/rte_compressdev_mingw with a custom command 00:01:23.048 [100/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:23.048 [101/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:23.048 [102/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:23.048 [103/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:23.048 [104/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:01:23.048 [105/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:23.048 [106/740] Linking static target lib/librte_ring.a 00:01:23.048 [107/740] Generating lib/rte_cryptodev_mingw with a custom command 00:01:23.048 [108/740] Generating lib/rte_distributor_def with a custom command 00:01:23.048 [109/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:23.048 [110/740] Generating lib/rte_cryptodev_def with a custom command 00:01:23.048 [111/740] Generating lib/rte_distributor_mingw with a custom command 00:01:23.048 [112/740] Generating lib/rte_efd_def with a custom command 00:01:23.048 [113/740] Generating lib/rte_efd_mingw with a custom command 00:01:23.048 [114/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:23.048 [115/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:23.048 [116/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:23.048 [117/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:23.048 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:23.048 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:23.048 [120/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:23.048 [121/740] Generating lib/rte_gpudev_def with a custom command 00:01:23.048 [122/740] Generating lib/rte_eventdev_def with a custom command 00:01:23.048 [123/740] Generating lib/rte_gpudev_mingw with a custom command 00:01:23.048 [124/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:23.048 [125/740] Generating lib/rte_eventdev_mingw with a custom command 00:01:23.048 [126/740] Compiling C object 
lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:23.048 [127/740] Generating lib/rte_gro_def with a custom command 00:01:23.048 [128/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:23.048 [129/740] Generating lib/rte_gro_mingw with a custom command 00:01:23.048 [130/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:23.048 [131/740] Generating lib/rte_gso_def with a custom command 00:01:23.048 [132/740] Generating lib/rte_gso_mingw with a custom command 00:01:23.307 [133/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:23.307 [134/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:23.307 [135/740] Generating lib/rte_ip_frag_def with a custom command 00:01:23.307 [136/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:23.307 [137/740] Generating lib/rte_ip_frag_mingw with a custom command 00:01:23.307 [138/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.307 [139/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.307 [140/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:23.307 [141/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:23.307 [142/740] Generating lib/rte_jobstats_def with a custom command 00:01:23.307 [143/740] Generating lib/rte_jobstats_mingw with a custom command 00:01:23.307 [144/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.307 [145/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:23.307 [146/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:23.307 [147/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:23.307 [148/740] Linking target lib/librte_kvargs.so.23.0 00:01:23.307 [149/740] Generating lib/rte_latencystats_def with a custom command 00:01:23.307 [150/740] Generating lib/rte_latencystats_mingw with a custom command 00:01:23.307 [151/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:23.307 [152/740] Linking static target lib/librte_cfgfile.a 00:01:23.307 [153/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:23.307 [154/740] Generating lib/rte_lpm_def with a custom command 00:01:23.307 [155/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:23.307 [156/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:23.307 [157/740] Generating lib/rte_lpm_mingw with a custom command 00:01:23.307 [158/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:23.307 [159/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:23.307 [160/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:23.307 [161/740] Generating lib/rte_member_def with a custom command 00:01:23.307 [162/740] Generating lib/rte_pcapng_def with a custom command 00:01:23.307 [163/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:23.567 [164/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:23.567 [165/740] Generating lib/rte_pcapng_mingw with a custom command 00:01:23.567 [166/740] Generating lib/rte_member_mingw with a custom command 00:01:23.567 [167/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:23.567 [168/740] Generating 
lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.567 [169/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:23.567 [170/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:23.567 [171/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:23.567 [172/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:23.567 [173/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:23.567 [174/740] Linking static target lib/librte_jobstats.a 00:01:23.567 [175/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:23.567 [176/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:23.567 [177/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:23.567 [178/740] Linking static target lib/librte_cmdline.a 00:01:23.567 [179/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:23.567 [180/740] Generating lib/rte_power_def with a custom command 00:01:23.567 [181/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:23.567 [182/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:23.567 [183/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:23.567 [184/740] Generating lib/rte_power_mingw with a custom command 00:01:23.567 [185/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:23.567 [186/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:23.567 [187/740] Generating lib/rte_rawdev_def with a custom command 00:01:23.567 [188/740] Linking static target lib/librte_telemetry.a 00:01:23.567 [189/740] Linking static target lib/librte_timer.a 00:01:23.567 [190/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:23.567 [191/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:23.567 [192/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:23.567 [193/740] Generating lib/rte_rawdev_mingw with a custom command 00:01:23.567 [194/740] Linking static target lib/librte_metrics.a 00:01:23.567 [195/740] Generating lib/rte_regexdev_def with a custom command 00:01:23.567 [196/740] Generating lib/rte_regexdev_mingw with a custom command 00:01:23.567 [197/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:23.567 [198/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:23.567 [199/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:23.567 [200/740] Generating lib/rte_dmadev_def with a custom command 00:01:23.567 [201/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:23.567 [202/740] Generating lib/rte_dmadev_mingw with a custom command 00:01:23.567 [203/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:01:23.567 [204/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:23.567 [205/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:23.567 [206/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:23.567 [207/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:23.567 [208/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:23.567 [209/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:23.567 [210/740] Generating lib/rte_rib_def with a custom command 00:01:23.567 [211/740] 
Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:23.567 [212/740] Generating lib/rte_rib_mingw with a custom command 00:01:23.567 [213/740] Linking static target lib/librte_net.a 00:01:23.567 [214/740] Generating lib/rte_reorder_def with a custom command 00:01:23.567 [215/740] Generating lib/rte_sched_def with a custom command 00:01:23.567 [216/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:23.567 [217/740] Generating lib/rte_reorder_mingw with a custom command 00:01:23.567 [218/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:23.567 [219/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:23.567 [220/740] Generating lib/rte_sched_mingw with a custom command 00:01:23.567 [221/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:23.567 [222/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:23.567 [223/740] Linking static target lib/librte_bitratestats.a 00:01:23.567 [224/740] Generating lib/rte_security_mingw with a custom command 00:01:23.567 [225/740] Generating lib/rte_security_def with a custom command 00:01:23.567 [226/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:23.567 [227/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:23.567 [228/740] Generating lib/rte_stack_mingw with a custom command 00:01:23.567 [229/740] Generating lib/rte_stack_def with a custom command 00:01:23.567 [230/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:23.567 [231/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:23.567 [232/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:23.567 [233/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:23.832 [234/740] Generating lib/rte_vhost_def with a custom command 00:01:23.832 [235/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:23.832 [236/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:23.832 [237/740] Generating lib/rte_vhost_mingw with a custom command 00:01:23.832 [238/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:23.832 [239/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:23.832 [240/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:23.832 [241/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:23.832 [242/740] Generating lib/rte_ipsec_def with a custom command 00:01:23.832 [243/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:23.832 [244/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:23.832 [245/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:23.832 [246/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:23.832 [247/740] Generating lib/rte_ipsec_mingw with a custom command 00:01:23.832 [248/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:23.832 [249/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:23.832 [250/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:23.832 [251/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:01:23.832 [252/740] Generating lib/rte_fib_def with a custom command 00:01:23.832 [253/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:23.832 
[254/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:23.832 [255/740] Generating lib/rte_fib_mingw with a custom command 00:01:23.832 [256/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:23.832 [257/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:23.832 [258/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:23.832 [259/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:23.832 [260/740] Linking static target lib/librte_stack.a 00:01:23.832 [261/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:23.832 [262/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:23.832 [263/740] Generating lib/rte_port_def with a custom command 00:01:23.832 [264/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:23.832 [265/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:23.832 [266/740] Generating lib/rte_port_mingw with a custom command 00:01:23.832 [267/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:23.832 [268/740] Generating lib/rte_pdump_def with a custom command 00:01:23.832 [269/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:23.832 [270/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.832 [271/740] Linking static target lib/librte_compressdev.a 00:01:23.832 [272/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:23.832 [273/740] Generating lib/rte_pdump_mingw with a custom command 00:01:23.832 [274/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:23.832 [275/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:23.832 [276/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:23.832 [277/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:24.096 [278/740] Linking static target lib/librte_rcu.a 00:01:24.096 [279/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:24.096 [280/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:24.096 [281/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:24.096 [282/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.096 [283/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:24.096 [284/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:24.096 [285/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.096 [286/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:24.096 [287/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:24.096 [288/740] Linking static target lib/librte_mempool.a 00:01:24.096 [289/740] Linking static target lib/librte_bbdev.a 00:01:24.096 [290/740] Linking static target lib/librte_rawdev.a 00:01:24.096 [291/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:24.096 [292/740] Generating lib/rte_table_def with a custom command 00:01:24.096 [293/740] Generating lib/rte_table_mingw with a custom command 00:01:24.096 [294/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:24.096 [295/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.096 [296/740] Compiling C 
object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:24.096 [297/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:24.096 [298/740] Linking static target lib/librte_gro.a 00:01:24.096 [299/740] Linking static target lib/librte_dmadev.a 00:01:24.096 [300/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:24.096 [301/740] Linking static target lib/librte_gpudev.a 00:01:24.096 [302/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.096 [303/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:24.096 [304/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.096 [305/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:24.096 [306/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:24.096 [307/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:24.096 [308/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:24.096 [309/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:24.096 [310/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:01:24.096 [311/740] Linking static target lib/librte_gso.a 00:01:24.096 [312/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.096 [313/740] Generating lib/rte_pipeline_def with a custom command 00:01:24.096 [314/740] Generating lib/rte_pipeline_mingw with a custom command 00:01:24.096 [315/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:24.096 [316/740] Linking static target lib/librte_latencystats.a 00:01:24.096 [317/740] Linking target lib/librte_telemetry.so.23.0 00:01:24.096 [318/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:24.096 [319/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.096 [320/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:24.096 [321/740] Generating lib/rte_graph_def with a custom command 00:01:24.096 [322/740] Generating lib/rte_graph_mingw with a custom command 00:01:24.096 [323/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:24.357 [324/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:24.357 [325/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:24.357 [326/740] Linking static target lib/librte_distributor.a 00:01:24.357 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:24.357 [328/740] Linking static target lib/librte_ip_frag.a 00:01:24.357 [329/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:24.357 [330/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:24.357 [331/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:24.357 [332/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:24.357 [333/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:24.357 [334/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:24.357 [335/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:24.357 [336/740] Linking static target lib/librte_regexdev.a 00:01:24.357 [337/740] Generating symbol file 
lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:01:24.357 [338/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.357 [339/740] Generating lib/rte_node_def with a custom command 00:01:24.357 [340/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:24.357 [341/740] Generating lib/rte_node_mingw with a custom command 00:01:24.357 [342/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:24.357 [343/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.357 [344/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:24.357 [345/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:24.357 [346/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:24.357 [347/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:24.357 [348/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.357 [349/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:24.357 [350/740] Generating drivers/rte_bus_pci_def with a custom command 00:01:24.357 [351/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:01:24.357 [352/740] Linking static target lib/librte_eal.a 00:01:24.357 [353/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:24.357 [354/740] Linking static target lib/librte_power.a 00:01:24.619 [355/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:24.619 [356/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:24.619 [357/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:24.619 [358/740] Generating drivers/rte_bus_vdev_def with a custom command 00:01:24.619 [359/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.619 [360/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:24.619 [361/740] Linking static target lib/librte_reorder.a 00:01:24.619 [362/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:01:24.619 [363/740] Generating drivers/rte_mempool_ring_def with a custom command 00:01:24.619 [364/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:24.619 [365/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:01:24.619 [366/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:24.619 [367/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:24.619 [368/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:24.619 [369/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:24.619 [370/740] Linking static target lib/librte_pcapng.a 00:01:24.619 [371/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:24.619 [372/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:24.619 [373/740] Linking static target lib/librte_security.a 00:01:24.619 [374/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.619 [375/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:24.619 [376/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:24.619 [377/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:24.619 
[378/740] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:24.619 [379/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:24.619 [380/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:24.619 [381/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:24.619 [382/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:24.619 [383/740] Linking static target lib/librte_bpf.a 00:01:24.619 [384/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:24.619 [385/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:24.619 [386/740] Linking static target lib/librte_mbuf.a 00:01:24.619 [387/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:24.619 [388/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:24.619 [389/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:24.619 [390/740] Generating drivers/rte_net_i40e_def with a custom command 00:01:24.619 [391/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.882 [392/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:01:24.882 [393/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:24.882 [394/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.882 [395/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:24.882 [396/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:24.882 [397/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:24.882 [398/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:24.882 [399/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:24.882 [400/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:24.882 [401/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:24.882 [402/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:24.883 [403/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:24.883 [404/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:24.883 [405/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:24.883 [406/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:24.883 [407/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:24.883 [408/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.883 [409/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:24.883 [410/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:24.883 [411/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:24.883 [412/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:24.883 [413/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:24.883 [414/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:24.883 [415/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:24.883 [416/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.883 [417/740] Linking static target lib/librte_lpm.a 00:01:24.883 [418/740] Compiling C 
object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:24.883 [419/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:24.883 [420/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:24.883 [421/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:24.883 [422/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:24.883 [423/740] Linking static target lib/librte_rib.a 00:01:24.883 [424/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.883 [425/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:24.883 [426/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:24.883 [427/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.883 [428/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:24.883 [429/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:25.147 [430/740] Linking static target lib/librte_graph.a 00:01:25.147 [431/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:25.147 [432/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:25.147 [433/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:25.147 [434/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:25.147 [435/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:25.147 [436/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:25.147 [437/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:25.147 [438/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.147 [439/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:25.147 [440/740] Linking static target lib/librte_efd.a 00:01:25.147 [441/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:25.147 [442/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:25.147 [443/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:25.147 [444/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:25.147 [445/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:25.147 [446/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:25.147 [447/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:25.147 [448/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:25.147 [449/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.147 [450/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:25.147 [451/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:25.147 [452/740] Linking static target drivers/librte_bus_vdev.a 00:01:25.147 [453/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.407 [454/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:25.407 [455/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:25.407 [456/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:25.407 [457/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:25.407 
[458/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:25.407 [459/740] Linking static target lib/librte_fib.a 00:01:25.407 [460/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.407 [461/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.407 [462/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:25.407 [463/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:25.407 [464/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.407 [465/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.407 [466/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.407 [467/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:25.670 [468/740] Linking static target lib/librte_pdump.a 00:01:25.670 [469/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:25.670 [470/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:25.670 [471/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:25.670 [472/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.670 [473/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:25.670 [474/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:25.670 [475/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:25.671 [476/740] Linking static target drivers/librte_bus_pci.a 00:01:25.671 [477/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:25.671 [478/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:25.671 [479/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:25.671 [480/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:25.671 [481/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:25.671 [482/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:25.671 [483/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:25.671 [484/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.671 [485/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:25.671 [486/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.671 [487/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.671 [488/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:25.671 [489/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:25.929 [490/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:25.929 [491/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:25.929 [492/740] Linking static target lib/librte_table.a 00:01:25.929 [493/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:25.930 [494/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:25.930 
[495/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:25.930 [496/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:25.930 [497/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:25.930 [498/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:25.930 [499/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:25.930 [500/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.930 [501/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:25.930 [502/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:25.930 [503/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:25.930 [504/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:25.930 [505/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:25.930 [506/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:25.930 [507/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:25.930 [508/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:25.930 [509/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.930 [510/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:25.930 [511/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:25.930 [512/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:25.930 [513/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:25.930 [514/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:25.930 [515/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:25.930 [516/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.189 [517/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:26.189 [518/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:26.189 [519/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:26.189 [520/740] Linking static target lib/librte_sched.a 00:01:26.189 [521/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:26.189 [522/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:26.189 [523/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:26.189 [524/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:26.189 [525/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:26.189 [526/740] Linking static target lib/librte_cryptodev.a 00:01:26.189 [527/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:26.189 [528/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:26.189 [529/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:26.189 [530/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:26.189 [531/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:26.189 [532/740] Compiling C 
object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:26.189 [533/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:26.189 [534/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.189 [535/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:26.189 [536/740] Linking static target lib/librte_node.a 00:01:26.189 [537/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:26.189 [538/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:26.189 [539/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:26.189 [540/740] Linking static target lib/librte_ipsec.a 00:01:26.189 [541/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:26.189 [542/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:26.449 [543/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.449 [544/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:26.449 [545/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:26.449 [546/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:26.449 [547/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:26.449 [548/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:26.449 [549/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:26.449 [550/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:26.449 [551/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:26.449 [552/740] Linking static target drivers/librte_mempool_ring.a 00:01:26.449 [553/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:26.449 [554/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:26.449 [555/740] Linking static target lib/librte_ethdev.a 00:01:26.449 [556/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:26.449 [557/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:26.449 [558/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:26.449 [559/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:26.449 [560/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:26.449 [561/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:26.449 [562/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.449 [563/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:26.449 [564/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:26.449 [565/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:26.449 [566/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:26.449 [567/740] Linking static target lib/librte_port.a 00:01:26.449 [568/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:26.449 [569/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:26.449 [570/740] Compiling C object 
app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:26.449 [571/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:26.449 [572/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:26.449 [573/740] Linking static target lib/librte_member.a 00:01:26.449 [574/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:26.449 [575/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:26.449 [576/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:26.449 [577/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:26.708 [578/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:26.708 [579/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:26.708 [580/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:26.708 [581/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:26.708 [582/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.708 [583/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:26.708 [584/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:26.708 [585/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:26.708 [586/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:26.708 [587/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:26.708 [588/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:26.708 [589/740] Linking static target lib/librte_hash.a 00:01:26.708 [590/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:26.708 [591/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.708 [592/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.708 [593/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:26.708 [594/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:26.708 [595/740] Linking static target lib/librte_eventdev.a 00:01:26.968 [596/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:01:26.968 [597/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:26.968 [598/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:26.968 [599/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:26.968 [600/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:26.968 [601/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:26.968 [602/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.968 [603/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:26.968 [604/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:26.968 [605/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:27.230 [606/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:27.230 [607/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:27.230 [608/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 
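Alongside each librte_*.a / librte_*.so.23.0 pair, the "Generating lib/<name>.sym_chk" entries record a meson custom command that checks which symbols the built library actually exports; per the log text it is wrapped by meson so its output is captured rather than interleaved with compiler output. A minimal way to eyeball the same information with standard binutils, assuming the build-tmp layout used in this log (the exact check DPDK runs, e.g. via buildtools/check-symbols.sh, may differ):

    # list the dynamic symbols the shared object defines, for comparison
    # against the stable-ABI list in the library's version.map
    nm -D --defined-only build-tmp/lib/librte_kvargs.so.23.0 | awk '{print $3}' | sort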
00:01:27.230 [609/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:27.230 [610/740] Linking static target lib/librte_acl.a 00:01:27.230 [611/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:01:27.230 [612/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:27.230 [613/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:27.496 [614/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:27.496 [615/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:27.755 [616/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:27.755 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:27.755 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:28.323 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:28.324 [620/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:28.583 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:29.151 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:29.151 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:29.410 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:29.410 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:29.410 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:29.410 [627/740] Linking static target drivers/librte_net_i40e.a 00:01:29.669 [628/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:29.928 [629/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:29.928 [630/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:30.187 [631/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:30.187 [632/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:30.446 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.732 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.732 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:35.732 [636/740] Linking static target lib/librte_vhost.a 00:01:36.669 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:01:36.669 [638/740] Linking static target lib/librte_pipeline.a 00:01:37.236 [639/740] Linking target app/dpdk-test-acl 00:01:37.236 [640/740] Linking target app/dpdk-dumpcap 00:01:37.236 [641/740] Linking target app/dpdk-pdump 00:01:37.236 [642/740] Linking target app/dpdk-test-gpudev 00:01:37.236 [643/740] Linking target app/dpdk-test-cmdline 00:01:37.236 [644/740] Linking target app/dpdk-proc-info 00:01:37.236 [645/740] Linking target app/dpdk-test-compress-perf 00:01:37.236 [646/740] Linking target app/dpdk-test-pipeline 00:01:37.236 [647/740] Linking target app/dpdk-test-flow-perf 00:01:37.236 [648/740] Linking target app/dpdk-test-crypto-perf 00:01:37.236 [649/740] Linking target app/dpdk-test-security-perf 00:01:37.236 [650/740] Linking target app/dpdk-test-regex 00:01:37.236 
[651/740] Linking target app/dpdk-test-eventdev 00:01:37.236 [652/740] Linking target app/dpdk-test-sad 00:01:37.236 [653/740] Linking target app/dpdk-test-bbdev 00:01:37.236 [654/740] Linking target app/dpdk-test-fib 00:01:37.236 [655/740] Linking target app/dpdk-testpmd 00:01:38.176 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.745 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.745 [658/740] Linking target lib/librte_eal.so.23.0 00:01:39.005 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:01:39.005 [660/740] Linking target lib/librte_ring.so.23.0 00:01:39.005 [661/740] Linking target lib/librte_pci.so.23.0 00:01:39.005 [662/740] Linking target lib/librte_timer.so.23.0 00:01:39.005 [663/740] Linking target lib/librte_meter.so.23.0 00:01:39.005 [664/740] Linking target lib/librte_cfgfile.so.23.0 00:01:39.005 [665/740] Linking target drivers/librte_bus_vdev.so.23.0 00:01:39.005 [666/740] Linking target lib/librte_jobstats.so.23.0 00:01:39.005 [667/740] Linking target lib/librte_dmadev.so.23.0 00:01:39.005 [668/740] Linking target lib/librte_stack.so.23.0 00:01:39.005 [669/740] Linking target lib/librte_rawdev.so.23.0 00:01:39.005 [670/740] Linking target lib/librte_acl.so.23.0 00:01:39.005 [671/740] Linking target lib/librte_graph.so.23.0 00:01:39.005 [672/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:01:39.005 [673/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:01:39.005 [674/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:01:39.005 [675/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:01:39.005 [676/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:01:39.005 [677/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:01:39.005 [678/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:01:39.005 [679/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:01:39.005 [680/740] Linking target drivers/librte_bus_pci.so.23.0 00:01:39.265 [681/740] Linking target lib/librte_rcu.so.23.0 00:01:39.265 [682/740] Linking target lib/librte_mempool.so.23.0 00:01:39.265 [683/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:01:39.265 [684/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:01:39.265 [685/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:01:39.265 [686/740] Linking target lib/librte_rib.so.23.0 00:01:39.265 [687/740] Linking target lib/librte_mbuf.so.23.0 00:01:39.265 [688/740] Linking target drivers/librte_mempool_ring.so.23.0 00:01:39.524 [689/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:01:39.524 [690/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:01:39.524 [691/740] Linking target lib/librte_net.so.23.0 00:01:39.524 [692/740] Linking target lib/librte_bbdev.so.23.0 00:01:39.524 [693/740] Linking target lib/librte_sched.so.23.0 00:01:39.524 [694/740] Linking target lib/librte_reorder.so.23.0 00:01:39.524 [695/740] Linking target lib/librte_gpudev.so.23.0 00:01:39.524 [696/740] Linking target lib/librte_cryptodev.so.23.0 00:01:39.524 [697/740] Linking 
target lib/librte_regexdev.so.23.0 00:01:39.524 [698/740] Linking target lib/librte_compressdev.so.23.0 00:01:39.524 [699/740] Linking target lib/librte_distributor.so.23.0 00:01:39.524 [700/740] Linking target lib/librte_fib.so.23.0 00:01:39.524 [701/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols 00:01:39.524 [702/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols 00:01:39.524 [703/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols 00:01:39.784 [704/740] Linking target lib/librte_cmdline.so.23.0 00:01:39.784 [705/740] Linking target lib/librte_hash.so.23.0 00:01:39.784 [706/740] Linking target lib/librte_security.so.23.0 00:01:39.784 [707/740] Linking target lib/librte_ethdev.so.23.0 00:01:39.784 [708/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols 00:01:39.784 [709/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols 00:01:39.784 [710/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols 00:01:39.784 [711/740] Linking target lib/librte_efd.so.23.0 00:01:39.784 [712/740] Linking target lib/librte_lpm.so.23.0 00:01:39.784 [713/740] Linking target lib/librte_member.so.23.0 00:01:39.784 [714/740] Linking target lib/librte_ipsec.so.23.0 00:01:39.784 [715/740] Linking target lib/librte_pcapng.so.23.0 00:01:40.043 [716/740] Linking target lib/librte_gro.so.23.0 00:01:40.043 [717/740] Linking target lib/librte_metrics.so.23.0 00:01:40.043 [718/740] Linking target lib/librte_bpf.so.23.0 00:01:40.043 [719/740] Linking target lib/librte_gso.so.23.0 00:01:40.043 [720/740] Linking target lib/librte_ip_frag.so.23.0 00:01:40.043 [721/740] Linking target lib/librte_power.so.23.0 00:01:40.043 [722/740] Linking target lib/librte_eventdev.so.23.0 00:01:40.043 [723/740] Linking target lib/librte_vhost.so.23.0 00:01:40.043 [724/740] Linking target drivers/librte_net_i40e.so.23.0 00:01:40.043 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols 00:01:40.043 [726/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols 00:01:40.043 [727/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols 00:01:40.043 [728/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols 00:01:40.044 [729/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols 00:01:40.044 [730/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols 00:01:40.044 [731/740] Linking target lib/librte_node.so.23.0 00:01:40.044 [732/740] Linking target lib/librte_latencystats.so.23.0 00:01:40.044 [733/740] Linking target lib/librte_bitratestats.so.23.0 00:01:40.044 [734/740] Linking target lib/librte_pdump.so.23.0 00:01:40.044 [735/740] Linking target lib/librte_port.so.23.0 00:01:40.303 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols 00:01:40.303 [737/740] Linking target lib/librte_table.so.23.0 00:01:40.563 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols 00:01:41.943 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.943 [740/740] Linking target lib/librte_pipeline.so.23.0 00:01:41.943 06:57:00 -- common/autobuild_common.sh@190 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install 00:01:42.203 
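With the build complete at 740/740, autobuild_common.sh line 190 re-enters the same build tree to stage the artifacts, as echoed above; the output that follows shows ninja copying the bundled example sources into build/share/dpdk/examples. The same step can be driven either through ninja, exactly as logged, or through meson's own front end (an assumed-equivalent invocation, not taken from this log):

    ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install
    meson install -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp   # assumed equivalent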
ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:42.203 [0/1] Installing files. 00:01:42.467 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.467 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:01:42.468 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.468 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:42.469 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:42.469 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.470 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 
00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:42.471 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.471 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:42.472 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:42.472 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:42.472 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:42.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:42.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:42.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:42.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:42.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:42.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:42.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:42.473 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:42.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:42.473 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:42.473 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_hash.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:01:42.473 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.473 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_ipsec.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:42.737 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:42.737 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:42.737 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.737 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:42.737 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-eventdev to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.737 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.738 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.739 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.740 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:42.741 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:42.741 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:01:42.741 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:01:42.741 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:01:42.741 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:01:42.741 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:01:42.741 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:01:42.741 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:01:42.741 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:01:42.741 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:01:42.741 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:01:42.741 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:01:42.741 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:01:42.741 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:01:42.741 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:01:42.741 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:01:42.741 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:01:42.741 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:01:42.741 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:01:42.741 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:01:42.741 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:01:42.741 Installing symlink pointing to librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:01:42.741 Installing symlink 
pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:01:42.741 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:01:42.741 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:01:42.741 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:01:42.741 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:01:42.741 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:01:42.741 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:01:42.741 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:01:42.741 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:01:42.741 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:01:42.741 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:01:42.741 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:01:42.741 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:01:42.741 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:01:42.741 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:01:42.741 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:01:42.741 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:01:42.742 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:01:42.742 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:01:42.742 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:01:42.742 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:01:42.742 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:01:42.742 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:01:42.742 Installing symlink pointing to librte_distributor.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:01:42.742 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:01:42.742 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:01:42.742 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:01:42.742 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:01:42.742 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:01:42.742 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:01:42.742 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:01:42.742 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:01:42.742 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:01:42.742 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:01:42.742 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:01:42.742 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:01:42.742 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:01:42.742 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:01:42.742 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:01:42.742 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:01:42.742 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:01:42.742 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:01:42.742 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:01:42.742 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:01:42.742 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:01:42.742 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:01:42.742 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:01:42.742 Installing symlink pointing to 
librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:01:42.742 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:01:42.742 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:01:42.742 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:01:42.742 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:01:42.742 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:01:42.742 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:01:42.742 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:01:42.742 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:01:42.742 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:01:42.742 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:01:42.742 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:01:42.742 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:01:42.742 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:01:42.742 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:01:42.742 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:01:42.742 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:01:42.742 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:01:42.742 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:01:42.742 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:01:42.742 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:01:42.742 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:01:42.742 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:01:42.742 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:01:42.742 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:01:42.742 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:01:42.742 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:01:42.742 './librte_mempool_ring.so.23' -> 
'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:01:42.742 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:01:42.742 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:01:42.742 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:01:42.742 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:01:42.742 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:01:42.742 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:01:42.742 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:01:42.742 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:01:42.742 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:01:42.742 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:01:42.742 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:01:42.742 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:01:42.742 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:01:42.742 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:01:42.742 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:01:42.742 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:01:42.742 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:01:42.742 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:01:42.742 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:01:42.742 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:01:42.742 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:01:42.742 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:01:42.742 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:01:42.742 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:01:42.742 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23
00:01:42.742 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so
00:01:42.742 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23
00:01:42.742 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so
00:01:42.742 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0'
00:01:42.742 06:57:00 -- common/autobuild_common.sh@192 -- $ uname -s
00:01:42.742 06:57:00 -- common/autobuild_common.sh@192 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:01:42.742 06:57:00 -- common/autobuild_common.sh@203 -- $ cat
00:01:42.742 06:57:00 -- common/autobuild_common.sh@208 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:42.742
00:01:42.742 real 0m26.302s
00:01:42.742 user 6m35.697s
00:01:42.742 sys 2m23.211s
00:01:42.742 06:57:00 -- common/autotest_common.sh@1115 -- $ xtrace_disable
00:01:42.742 06:57:00 -- common/autotest_common.sh@10 -- $ set +x
00:01:42.742 ************************************
00:01:42.742 END TEST build_native_dpdk
00:01:42.742 ************************************
00:01:42.742 06:57:00 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:42.742 06:57:00 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:42.742 06:57:00 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:01:42.742 06:57:00 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:01:42.742 06:57:00 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:01:42.742 06:57:00 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']'
00:01:42.742 06:57:00 -- common/autotest_common.sh@1093 -- $ xtrace_disable
00:01:42.743 06:57:00 -- common/autotest_common.sh@10 -- $ set +x
00:01:42.743 ************************************
00:01:42.743 START TEST autobuild_llvm_precompile
00:01:42.743 ************************************
00:01:42.743 06:57:00 -- common/autotest_common.sh@1114 -- $ _llvm_precompile
00:01:42.743 06:57:00 -- common/autobuild_common.sh@32 -- $ clang --version
00:01:42.743 06:57:00 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39)
00:01:42.743 Target: x86_64-redhat-linux-gnu
00:01:42.743 Thread model: posix
00:01:42.743 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:01:42.743 06:57:00 -- common/autobuild_common.sh@33 -- $ clang_num=17
00:01:42.743 06:57:00 -- common/autobuild_common.sh@35 -- $ export CC=clang-17
00:01:42.743 06:57:00 -- common/autobuild_common.sh@35 -- $ CC=clang-17
00:01:42.743 06:57:00 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17
00:01:42.743 06:57:00 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17
00:01:42.743 06:57:00 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:01:42.743 06:57:00 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:01:42.743 06:57:00 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]]
00:01:42.743 06:57:00 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a'
06:57:00 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:01:43.002 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:01:43.262 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:01:43.262 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:01:43.262 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:01:43.831 Using 'verbs' RDMA provider
00:01:59.292 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:14.210 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:14.210 Creating mk/config.mk...done.
00:02:14.210 Creating mk/cc.flags.mk...done.
00:02:14.210 Type 'make' to build.
00:02:14.210
00:02:14.210 real 0m29.425s
00:02:14.210 user 0m12.904s
00:02:14.210 sys 0m15.998s
06:57:30 -- common/autotest_common.sh@1115 -- $ xtrace_disable
06:57:30 -- common/autotest_common.sh@10 -- $ set +x
00:02:14.210 ************************************
00:02:14.210 END TEST autobuild_llvm_precompile
00:02:14.210 ************************************
00:02:14.210 06:57:30 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
06:57:30 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
06:57:30 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
06:57:30 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
06:57:30 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:14.210 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:14.210 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:14.210 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:14.210 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:14.210 Using 'verbs' RDMA provider
00:02:26.505 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done.
00:02:38.717 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done.
00:02:38.717 Creating mk/config.mk...done.
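The xtrace above shows autobuild_common.sh probing the toolchain: it parses the major version out of `clang --version`, exports CC/CXX, and resolves the libFuzzer archive with an extglob pattern. A minimal bash sketch of that probe, mirroring the trace rather than reproducing the verbatim script (the `re` variable name is ours):

    # Minimal sketch, assuming bash with extglob; mirrors the trace above.
    re='version (([0-9]+)\.([0-9]+)\.([0-9]+))'
    [[ "$(clang --version)" =~ $re ]] && clang_num="${BASH_REMATCH[2]}"   # 17 in this run
    export CC="clang-$clang_num" CXX="clang++-$clang_num"
    shopt -s extglob nullglob
    fuzzer_libs=(/usr/lib*/clang/"$clang_num"/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
    fuzzer_lib="${fuzzer_libs[0]}"   # here: /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
    [[ -e "$fuzzer_lib" ]]           # the existence check at autobuild_common.sh@40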
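Both configure runs above pick up the freshly staged DPDK through pkg-config ("Using .../dpdk/build/lib/pkgconfig for additional libs..."). A hedged way to verify that resolution by hand against the same staged prefix; these are standard pkg-config queries, not commands taken from this job:

    # Point pkg-config at the libdpdk.pc installed earlier, then query it.
    export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
    pkg-config --exists libdpdk && echo "libdpdk $(pkg-config --modversion libdpdk)"
    pkg-config --cflags --libs libdpdk   # the include/lib flags configure consumes
    # SPDK is then pointed at the same prefix via --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build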
00:02:38.717 Creating mk/cc.flags.mk...done.
00:02:38.717 Type 'make' to build.
06:57:55 -- spdk/autobuild.sh@69 -- $ run_test make make -j112
06:57:55 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']'
06:57:55 -- common/autotest_common.sh@1093 -- $ xtrace_disable
06:57:55 -- common/autotest_common.sh@10 -- $ set +x
00:02:38.717 ************************************
00:02:38.717 START TEST make
00:02:38.717 ************************************
00:02:38.717 06:57:55 -- common/autotest_common.sh@1114 -- $ make -j112
00:02:38.717 make[1]: Nothing to be done for 'all'.
00:02:39.284 The Meson build system
00:02:39.284 Version: 1.5.0
00:02:39.284 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:02:39.284 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:02:39.284 Build type: native build
00:02:39.284 Project name: libvfio-user
00:02:39.284 Project version: 0.0.1
00:02:39.284 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:02:39.284 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:02:39.284 Host machine cpu family: x86_64
00:02:39.284 Host machine cpu: x86_64
00:02:39.284 Run-time dependency threads found: YES
00:02:39.284 Library dl found: YES
00:02:39.284 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:02:39.284 Run-time dependency json-c found: YES 0.17
00:02:39.284 Run-time dependency cmocka found: YES 1.1.7
00:02:39.284 Program pytest-3 found: NO
00:02:39.284 Program flake8 found: NO
00:02:39.284 Program misspell-fixer found: NO
00:02:39.284 Program restructuredtext-lint found: NO
00:02:39.284 Program valgrind found: YES (/usr/bin/valgrind)
00:02:39.284 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:39.284 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:39.284 Compiler for C supports arguments -Wwrite-strings: YES
00:02:39.284 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:39.284 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:02:39.284 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:02:39.284 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:02:39.284 Build targets in project: 8 00:02:39.284 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:02:39.284 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:02:39.284 00:02:39.284 libvfio-user 0.0.1 00:02:39.284 00:02:39.284 User defined options 00:02:39.284 buildtype : debug 00:02:39.284 default_library: static 00:02:39.284 libdir : /usr/local/lib 00:02:39.284 00:02:39.284 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:39.853 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:39.853 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:02:39.853 [2/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:02:39.853 [3/36] Compiling C object samples/null.p/null.c.o 00:02:39.853 [4/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:02:39.853 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:02:39.853 [6/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:02:39.853 [7/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:02:39.853 [8/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:02:39.853 [9/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:02:39.853 [10/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:02:39.853 [11/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:02:39.853 [12/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:02:39.853 [13/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:02:39.853 [14/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:02:39.853 [15/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:02:39.853 [16/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:02:39.853 [17/36] Compiling C object test/unit_tests.p/mocks.c.o 00:02:39.853 [18/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:02:39.853 [19/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:02:39.853 [20/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:02:39.853 [21/36] Compiling C object samples/server.p/server.c.o 00:02:39.853 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:02:39.853 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:02:39.853 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:02:39.853 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:02:39.853 [26/36] Compiling C object samples/client.p/client.c.o 00:02:39.853 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:02:39.853 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:02:39.853 [29/36] Linking static target lib/libvfio-user.a 00:02:39.853 [30/36] Linking target samples/client 00:02:39.853 [31/36] Linking target samples/null 00:02:39.853 [32/36] Linking target test/unit_tests 00:02:39.853 [33/36] Linking target samples/shadow_ioeventfd_server 00:02:39.853 [34/36] Linking target samples/gpio-pci-idio-16 00:02:39.853 [35/36] Linking target samples/server 00:02:40.112 [36/36] Linking target samples/lspci 00:02:40.112 INFO: autodetecting backend as ninja 00:02:40.112 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:40.112 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:40.371 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:40.371 ninja: no work to do. 00:02:43.660 CC lib/ut_mock/mock.o 00:02:43.660 CC lib/log/log.o 00:02:43.660 CC lib/log/log_flags.o 00:02:43.660 CC lib/log/log_deprecated.o 00:02:43.660 CC lib/ut/ut.o 00:02:43.660 LIB libspdk_ut_mock.a 00:02:43.660 LIB libspdk_log.a 00:02:43.660 LIB libspdk_ut.a 00:02:43.918 CC lib/dma/dma.o 00:02:43.918 CC lib/util/base64.o 00:02:43.918 CC lib/util/bit_array.o 00:02:43.918 CC lib/util/cpuset.o 00:02:43.918 CC lib/util/crc16.o 00:02:43.918 CC lib/util/crc32.o 00:02:43.918 CC lib/ioat/ioat.o 00:02:43.918 CC lib/util/crc32c.o 00:02:43.918 CC lib/util/crc32_ieee.o 00:02:43.918 CC lib/util/crc64.o 00:02:43.918 CC lib/util/dif.o 00:02:43.918 CC lib/util/file.o 00:02:43.918 CC lib/util/fd.o 00:02:43.919 CXX lib/trace_parser/trace.o 00:02:43.919 CC lib/util/hexlify.o 00:02:43.919 CC lib/util/iov.o 00:02:43.919 CC lib/util/pipe.o 00:02:43.919 CC lib/util/math.o 00:02:43.919 CC lib/util/strerror_tls.o 00:02:43.919 CC lib/util/string.o 00:02:43.919 CC lib/util/fd_group.o 00:02:43.919 CC lib/util/uuid.o 00:02:43.919 CC lib/util/xor.o 00:02:43.919 CC lib/util/zipf.o 00:02:43.919 LIB libspdk_dma.a 00:02:43.919 CC lib/vfio_user/host/vfio_user_pci.o 00:02:43.919 CC lib/vfio_user/host/vfio_user.o 00:02:43.919 LIB libspdk_ioat.a 00:02:44.177 LIB libspdk_vfio_user.a 00:02:44.177 LIB libspdk_util.a 00:02:44.177 LIB libspdk_trace_parser.a 00:02:44.435 CC lib/json/json_parse.o 00:02:44.435 CC lib/conf/conf.o 00:02:44.435 CC lib/json/json_util.o 00:02:44.435 CC lib/json/json_write.o 00:02:44.435 CC lib/rdma/common.o 00:02:44.435 CC lib/rdma/rdma_verbs.o 00:02:44.435 CC lib/env_dpdk/env.o 00:02:44.435 CC lib/env_dpdk/memory.o 00:02:44.435 CC lib/vmd/vmd.o 00:02:44.435 CC lib/idxd/idxd.o 00:02:44.435 CC lib/env_dpdk/pci.o 00:02:44.435 CC lib/vmd/led.o 00:02:44.435 CC lib/env_dpdk/init.o 00:02:44.435 CC lib/idxd/idxd_user.o 00:02:44.435 CC lib/env_dpdk/threads.o 00:02:44.435 CC lib/idxd/idxd_kernel.o 00:02:44.435 CC lib/env_dpdk/pci_ioat.o 00:02:44.435 CC lib/env_dpdk/pci_virtio.o 00:02:44.435 CC lib/env_dpdk/pci_vmd.o 00:02:44.435 CC lib/env_dpdk/pci_idxd.o 00:02:44.435 CC lib/env_dpdk/pci_event.o 00:02:44.435 CC lib/env_dpdk/sigbus_handler.o 00:02:44.435 CC lib/env_dpdk/pci_dpdk.o 00:02:44.435 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:44.435 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:44.694 LIB libspdk_conf.a 00:02:44.694 LIB libspdk_json.a 00:02:44.694 LIB libspdk_rdma.a 00:02:44.694 LIB libspdk_idxd.a 00:02:44.694 LIB libspdk_vmd.a 00:02:44.955 CC lib/jsonrpc/jsonrpc_server.o 00:02:44.955 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:44.955 CC lib/jsonrpc/jsonrpc_client.o 00:02:44.955 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:45.214 LIB libspdk_jsonrpc.a 00:02:45.473 LIB libspdk_env_dpdk.a 00:02:45.473 CC lib/rpc/rpc.o 00:02:45.473 LIB libspdk_rpc.a 00:02:46.042 CC lib/notify/notify.o 00:02:46.042 CC lib/notify/notify_rpc.o 00:02:46.042 CC lib/sock/sock.o 00:02:46.042 CC lib/sock/sock_rpc.o 00:02:46.042 CC lib/trace/trace.o 00:02:46.042 CC lib/trace/trace_flags.o 00:02:46.042 CC lib/trace/trace_rpc.o 00:02:46.042 LIB libspdk_notify.a 00:02:46.042 LIB libspdk_trace.a 00:02:46.042 LIB libspdk_sock.a 00:02:46.301 CC lib/thread/thread.o 00:02:46.301 CC lib/thread/iobuf.o 00:02:46.301 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:46.301 CC lib/nvme/nvme_ctrlr.o 00:02:46.301 CC 
lib/nvme/nvme_fabric.o 00:02:46.301 CC lib/nvme/nvme_ns_cmd.o 00:02:46.560 CC lib/nvme/nvme_ns.o 00:02:46.560 CC lib/nvme/nvme_pcie_common.o 00:02:46.560 CC lib/nvme/nvme_pcie.o 00:02:46.560 CC lib/nvme/nvme_qpair.o 00:02:46.560 CC lib/nvme/nvme_quirks.o 00:02:46.560 CC lib/nvme/nvme.o 00:02:46.560 CC lib/nvme/nvme_transport.o 00:02:46.560 CC lib/nvme/nvme_discovery.o 00:02:46.560 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:46.560 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:46.560 CC lib/nvme/nvme_tcp.o 00:02:46.560 CC lib/nvme/nvme_opal.o 00:02:46.560 CC lib/nvme/nvme_io_msg.o 00:02:46.560 CC lib/nvme/nvme_poll_group.o 00:02:46.560 CC lib/nvme/nvme_zns.o 00:02:46.560 CC lib/nvme/nvme_cuse.o 00:02:46.560 CC lib/nvme/nvme_vfio_user.o 00:02:46.561 CC lib/nvme/nvme_rdma.o 00:02:47.130 LIB libspdk_thread.a 00:02:47.390 CC lib/init/json_config.o 00:02:47.390 CC lib/vfu_tgt/tgt_endpoint.o 00:02:47.390 CC lib/init/subsystem.o 00:02:47.390 CC lib/vfu_tgt/tgt_rpc.o 00:02:47.390 CC lib/accel/accel.o 00:02:47.390 CC lib/init/subsystem_rpc.o 00:02:47.390 CC lib/accel/accel_rpc.o 00:02:47.390 CC lib/init/rpc.o 00:02:47.390 CC lib/accel/accel_sw.o 00:02:47.390 CC lib/blob/blobstore.o 00:02:47.390 CC lib/virtio/virtio.o 00:02:47.390 CC lib/virtio/virtio_vhost_user.o 00:02:47.390 CC lib/blob/request.o 00:02:47.390 CC lib/blob/zeroes.o 00:02:47.390 CC lib/virtio/virtio_vfio_user.o 00:02:47.390 CC lib/virtio/virtio_pci.o 00:02:47.390 CC lib/blob/blob_bs_dev.o 00:02:47.649 LIB libspdk_init.a 00:02:47.649 LIB libspdk_vfu_tgt.a 00:02:47.649 LIB libspdk_virtio.a 00:02:47.649 LIB libspdk_nvme.a 00:02:47.908 CC lib/event/app.o 00:02:47.908 CC lib/event/reactor.o 00:02:47.908 CC lib/event/log_rpc.o 00:02:47.908 CC lib/event/app_rpc.o 00:02:47.908 CC lib/event/scheduler_static.o 00:02:48.166 LIB libspdk_accel.a 00:02:48.166 LIB libspdk_event.a 00:02:48.425 CC lib/bdev/bdev.o 00:02:48.425 CC lib/bdev/bdev_rpc.o 00:02:48.425 CC lib/bdev/bdev_zone.o 00:02:48.425 CC lib/bdev/part.o 00:02:48.425 CC lib/bdev/scsi_nvme.o 00:02:48.994 LIB libspdk_blob.a 00:02:49.253 CC lib/lvol/lvol.o 00:02:49.253 CC lib/blobfs/blobfs.o 00:02:49.253 CC lib/blobfs/tree.o 00:02:49.822 LIB libspdk_lvol.a 00:02:49.822 LIB libspdk_blobfs.a 00:02:50.081 LIB libspdk_bdev.a 00:02:50.339 CC lib/scsi/dev.o 00:02:50.339 CC lib/scsi/lun.o 00:02:50.339 CC lib/scsi/port.o 00:02:50.339 CC lib/scsi/scsi.o 00:02:50.339 CC lib/scsi/scsi_bdev.o 00:02:50.339 CC lib/nvmf/ctrlr.o 00:02:50.339 CC lib/scsi/scsi_pr.o 00:02:50.339 CC lib/nvmf/ctrlr_discovery.o 00:02:50.339 CC lib/nvmf/ctrlr_bdev.o 00:02:50.339 CC lib/scsi/scsi_rpc.o 00:02:50.339 CC lib/nvmf/nvmf.o 00:02:50.339 CC lib/nvmf/subsystem.o 00:02:50.339 CC lib/scsi/task.o 00:02:50.339 CC lib/nvmf/nvmf_rpc.o 00:02:50.339 CC lib/nvmf/transport.o 00:02:50.339 CC lib/nvmf/tcp.o 00:02:50.339 CC lib/nvmf/vfio_user.o 00:02:50.339 CC lib/nvmf/rdma.o 00:02:50.339 CC lib/nbd/nbd.o 00:02:50.339 CC lib/ftl/ftl_core.o 00:02:50.339 CC lib/ublk/ublk.o 00:02:50.339 CC lib/ftl/ftl_init.o 00:02:50.339 CC lib/nbd/nbd_rpc.o 00:02:50.339 CC lib/ublk/ublk_rpc.o 00:02:50.339 CC lib/ftl/ftl_layout.o 00:02:50.339 CC lib/ftl/ftl_debug.o 00:02:50.339 CC lib/ftl/ftl_io.o 00:02:50.339 CC lib/ftl/ftl_sb.o 00:02:50.339 CC lib/ftl/ftl_l2p.o 00:02:50.339 CC lib/ftl/ftl_l2p_flat.o 00:02:50.339 CC lib/ftl/ftl_nv_cache.o 00:02:50.339 CC lib/ftl/ftl_band.o 00:02:50.339 CC lib/ftl/ftl_band_ops.o 00:02:50.339 CC lib/ftl/ftl_writer.o 00:02:50.339 CC lib/ftl/ftl_rq.o 00:02:50.339 CC lib/ftl/ftl_l2p_cache.o 00:02:50.339 CC lib/ftl/ftl_reloc.o 00:02:50.339 
CC lib/ftl/ftl_p2l.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:50.339 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:50.339 CC lib/ftl/utils/ftl_conf.o 00:02:50.339 CC lib/ftl/utils/ftl_md.o 00:02:50.339 CC lib/ftl/utils/ftl_mempool.o 00:02:50.339 CC lib/ftl/utils/ftl_bitmap.o 00:02:50.339 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:50.339 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:50.339 CC lib/ftl/utils/ftl_property.o 00:02:50.339 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:50.339 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:50.339 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:50.339 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:50.339 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:50.339 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:50.339 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:50.340 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:50.340 CC lib/ftl/base/ftl_base_dev.o 00:02:50.340 CC lib/ftl/base/ftl_base_bdev.o 00:02:50.340 CC lib/ftl/ftl_trace.o 00:02:50.598 LIB libspdk_nbd.a 00:02:50.598 LIB libspdk_scsi.a 00:02:50.856 LIB libspdk_ublk.a 00:02:50.856 CC lib/iscsi/conn.o 00:02:50.856 CC lib/iscsi/init_grp.o 00:02:50.856 CC lib/iscsi/iscsi.o 00:02:50.856 CC lib/iscsi/md5.o 00:02:50.856 CC lib/iscsi/param.o 00:02:50.856 CC lib/iscsi/iscsi_subsystem.o 00:02:50.856 CC lib/iscsi/portal_grp.o 00:02:50.856 CC lib/iscsi/tgt_node.o 00:02:50.856 CC lib/iscsi/iscsi_rpc.o 00:02:50.856 CC lib/iscsi/task.o 00:02:50.856 CC lib/vhost/vhost.o 00:02:50.856 CC lib/vhost/vhost_rpc.o 00:02:50.856 CC lib/vhost/vhost_scsi.o 00:02:50.856 CC lib/vhost/vhost_blk.o 00:02:50.856 CC lib/vhost/rte_vhost_user.o 00:02:51.115 LIB libspdk_ftl.a 00:02:51.373 LIB libspdk_nvmf.a 00:02:51.633 LIB libspdk_vhost.a 00:02:51.633 LIB libspdk_iscsi.a 00:02:52.200 CC module/vfu_device/vfu_virtio.o 00:02:52.200 CC module/vfu_device/vfu_virtio_blk.o 00:02:52.200 CC module/vfu_device/vfu_virtio_scsi.o 00:02:52.200 CC module/vfu_device/vfu_virtio_rpc.o 00:02:52.200 CC module/env_dpdk/env_dpdk_rpc.o 00:02:52.200 LIB libspdk_env_dpdk_rpc.a 00:02:52.200 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:52.200 CC module/sock/posix/posix.o 00:02:52.200 CC module/accel/error/accel_error.o 00:02:52.200 CC module/accel/error/accel_error_rpc.o 00:02:52.200 CC module/scheduler/gscheduler/gscheduler.o 00:02:52.200 CC module/accel/iaa/accel_iaa.o 00:02:52.200 CC module/accel/iaa/accel_iaa_rpc.o 00:02:52.200 CC module/accel/ioat/accel_ioat.o 00:02:52.200 CC module/accel/dsa/accel_dsa.o 00:02:52.200 CC module/accel/ioat/accel_ioat_rpc.o 00:02:52.200 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:52.200 CC module/accel/dsa/accel_dsa_rpc.o 00:02:52.200 CC module/blob/bdev/blob_bdev.o 00:02:52.459 LIB libspdk_scheduler_dpdk_governor.a 00:02:52.459 LIB libspdk_scheduler_gscheduler.a 00:02:52.459 LIB libspdk_accel_error.a 00:02:52.459 LIB libspdk_scheduler_dynamic.a 00:02:52.459 LIB libspdk_accel_ioat.a 00:02:52.459 LIB libspdk_accel_iaa.a 00:02:52.459 LIB libspdk_accel_dsa.a 00:02:52.459 LIB libspdk_blob_bdev.a 00:02:52.459 LIB libspdk_vfu_device.a 00:02:52.718 LIB 
libspdk_sock_posix.a 00:02:52.976 CC module/bdev/null/bdev_null.o 00:02:52.976 CC module/bdev/lvol/vbdev_lvol.o 00:02:52.976 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:52.976 CC module/bdev/null/bdev_null_rpc.o 00:02:52.976 CC module/bdev/passthru/vbdev_passthru.o 00:02:52.976 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:52.976 CC module/blobfs/bdev/blobfs_bdev.o 00:02:52.976 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:52.976 CC module/bdev/iscsi/bdev_iscsi.o 00:02:52.976 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:52.976 CC module/bdev/aio/bdev_aio.o 00:02:52.976 CC module/bdev/aio/bdev_aio_rpc.o 00:02:52.976 CC module/bdev/gpt/gpt.o 00:02:52.976 CC module/bdev/gpt/vbdev_gpt.o 00:02:52.976 CC module/bdev/delay/vbdev_delay.o 00:02:52.976 CC module/bdev/error/vbdev_error.o 00:02:52.976 CC module/bdev/nvme/bdev_nvme.o 00:02:52.976 CC module/bdev/error/vbdev_error_rpc.o 00:02:52.976 CC module/bdev/raid/bdev_raid.o 00:02:52.976 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:52.976 CC module/bdev/nvme/nvme_rpc.o 00:02:52.976 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:52.976 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:52.976 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:52.976 CC module/bdev/raid/bdev_raid_rpc.o 00:02:52.976 CC module/bdev/malloc/bdev_malloc.o 00:02:52.976 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:52.976 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:52.976 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:52.976 CC module/bdev/raid/raid0.o 00:02:52.976 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:52.976 CC module/bdev/nvme/bdev_mdns_client.o 00:02:52.976 CC module/bdev/raid/bdev_raid_sb.o 00:02:52.976 CC module/bdev/nvme/vbdev_opal.o 00:02:52.976 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:52.976 CC module/bdev/raid/raid1.o 00:02:52.976 CC module/bdev/split/vbdev_split.o 00:02:52.976 CC module/bdev/split/vbdev_split_rpc.o 00:02:52.976 CC module/bdev/raid/concat.o 00:02:52.976 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:52.976 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:52.976 CC module/bdev/ftl/bdev_ftl.o 00:02:52.976 LIB libspdk_blobfs_bdev.a 00:02:52.976 LIB libspdk_bdev_split.a 00:02:52.976 LIB libspdk_bdev_null.a 00:02:53.235 LIB libspdk_bdev_gpt.a 00:02:53.235 LIB libspdk_bdev_error.a 00:02:53.235 LIB libspdk_bdev_passthru.a 00:02:53.235 LIB libspdk_bdev_ftl.a 00:02:53.235 LIB libspdk_bdev_aio.a 00:02:53.235 LIB libspdk_bdev_zone_block.a 00:02:53.235 LIB libspdk_bdev_delay.a 00:02:53.235 LIB libspdk_bdev_iscsi.a 00:02:53.235 LIB libspdk_bdev_malloc.a 00:02:53.235 LIB libspdk_bdev_lvol.a 00:02:53.235 LIB libspdk_bdev_virtio.a 00:02:53.494 LIB libspdk_bdev_raid.a 00:02:54.061 LIB libspdk_bdev_nvme.a 00:02:54.630 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:54.630 CC module/event/subsystems/scheduler/scheduler.o 00:02:54.630 CC module/event/subsystems/vmd/vmd.o 00:02:54.630 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:54.630 CC module/event/subsystems/sock/sock.o 00:02:54.630 CC module/event/subsystems/iobuf/iobuf.o 00:02:54.630 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:54.630 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:54.889 LIB libspdk_event_vhost_blk.a 00:02:54.889 LIB libspdk_event_sock.a 00:02:54.889 LIB libspdk_event_scheduler.a 00:02:54.889 LIB libspdk_event_vmd.a 00:02:54.889 LIB libspdk_event_vfu_tgt.a 00:02:54.889 LIB libspdk_event_iobuf.a 00:02:55.148 CC module/event/subsystems/accel/accel.o 00:02:55.148 LIB libspdk_event_accel.a 00:02:55.407 CC module/event/subsystems/bdev/bdev.o 00:02:55.666 LIB 
libspdk_event_bdev.a 00:02:55.926 CC module/event/subsystems/scsi/scsi.o 00:02:55.926 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:55.926 CC module/event/subsystems/ublk/ublk.o 00:02:55.926 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:55.926 CC module/event/subsystems/nbd/nbd.o 00:02:56.184 LIB libspdk_event_ublk.a 00:02:56.184 LIB libspdk_event_nbd.a 00:02:56.184 LIB libspdk_event_scsi.a 00:02:56.184 LIB libspdk_event_nvmf.a 00:02:56.444 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:56.444 CC module/event/subsystems/iscsi/iscsi.o 00:02:56.444 LIB libspdk_event_vhost_scsi.a 00:02:56.444 LIB libspdk_event_iscsi.a 00:02:57.014 CC app/spdk_nvme_perf/perf.o 00:02:57.014 CC app/trace_record/trace_record.o 00:02:57.014 CC app/spdk_top/spdk_top.o 00:02:57.014 CXX app/trace/trace.o 00:02:57.014 CC app/spdk_lspci/spdk_lspci.o 00:02:57.014 CC app/spdk_nvme_discover/discovery_aer.o 00:02:57.014 CC app/spdk_nvme_identify/identify.o 00:02:57.014 TEST_HEADER include/spdk/accel.h 00:02:57.014 TEST_HEADER include/spdk/accel_module.h 00:02:57.014 TEST_HEADER include/spdk/assert.h 00:02:57.014 CC test/rpc_client/rpc_client_test.o 00:02:57.014 TEST_HEADER include/spdk/base64.h 00:02:57.014 TEST_HEADER include/spdk/bdev.h 00:02:57.014 TEST_HEADER include/spdk/bdev_module.h 00:02:57.014 TEST_HEADER include/spdk/barrier.h 00:02:57.014 TEST_HEADER include/spdk/bdev_zone.h 00:02:57.014 TEST_HEADER include/spdk/bit_array.h 00:02:57.014 TEST_HEADER include/spdk/bit_pool.h 00:02:57.014 TEST_HEADER include/spdk/blob_bdev.h 00:02:57.014 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:57.014 TEST_HEADER include/spdk/blobfs.h 00:02:57.014 TEST_HEADER include/spdk/blob.h 00:02:57.014 TEST_HEADER include/spdk/config.h 00:02:57.014 TEST_HEADER include/spdk/conf.h 00:02:57.014 TEST_HEADER include/spdk/cpuset.h 00:02:57.014 TEST_HEADER include/spdk/crc16.h 00:02:57.014 TEST_HEADER include/spdk/crc32.h 00:02:57.014 CC app/nvmf_tgt/nvmf_main.o 00:02:57.014 TEST_HEADER include/spdk/crc64.h 00:02:57.014 TEST_HEADER include/spdk/dif.h 00:02:57.014 TEST_HEADER include/spdk/dma.h 00:02:57.014 TEST_HEADER include/spdk/endian.h 00:02:57.014 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:57.014 TEST_HEADER include/spdk/env_dpdk.h 00:02:57.014 TEST_HEADER include/spdk/event.h 00:02:57.014 TEST_HEADER include/spdk/env.h 00:02:57.014 TEST_HEADER include/spdk/fd_group.h 00:02:57.014 TEST_HEADER include/spdk/fd.h 00:02:57.014 TEST_HEADER include/spdk/ftl.h 00:02:57.014 TEST_HEADER include/spdk/file.h 00:02:57.014 TEST_HEADER include/spdk/gpt_spec.h 00:02:57.014 CC app/spdk_dd/spdk_dd.o 00:02:57.015 TEST_HEADER include/spdk/histogram_data.h 00:02:57.015 TEST_HEADER include/spdk/hexlify.h 00:02:57.015 TEST_HEADER include/spdk/idxd.h 00:02:57.015 TEST_HEADER include/spdk/idxd_spec.h 00:02:57.015 TEST_HEADER include/spdk/init.h 00:02:57.015 TEST_HEADER include/spdk/ioat.h 00:02:57.015 TEST_HEADER include/spdk/ioat_spec.h 00:02:57.015 TEST_HEADER include/spdk/iscsi_spec.h 00:02:57.015 TEST_HEADER include/spdk/json.h 00:02:57.015 TEST_HEADER include/spdk/jsonrpc.h 00:02:57.015 TEST_HEADER include/spdk/likely.h 00:02:57.015 TEST_HEADER include/spdk/log.h 00:02:57.015 TEST_HEADER include/spdk/lvol.h 00:02:57.015 TEST_HEADER include/spdk/memory.h 00:02:57.015 TEST_HEADER include/spdk/mmio.h 00:02:57.015 TEST_HEADER include/spdk/nbd.h 00:02:57.015 TEST_HEADER include/spdk/notify.h 00:02:57.015 TEST_HEADER include/spdk/nvme.h 00:02:57.015 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:57.015 TEST_HEADER include/spdk/nvme_intel.h 
00:02:57.015 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:57.015 TEST_HEADER include/spdk/nvme_spec.h 00:02:57.015 TEST_HEADER include/spdk/nvme_zns.h 00:02:57.015 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:57.015 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:57.015 TEST_HEADER include/spdk/nvmf_transport.h 00:02:57.015 TEST_HEADER include/spdk/nvmf.h 00:02:57.015 TEST_HEADER include/spdk/nvmf_spec.h 00:02:57.015 TEST_HEADER include/spdk/opal.h 00:02:57.015 TEST_HEADER include/spdk/pci_ids.h 00:02:57.015 TEST_HEADER include/spdk/opal_spec.h 00:02:57.015 TEST_HEADER include/spdk/pipe.h 00:02:57.015 TEST_HEADER include/spdk/queue.h 00:02:57.015 TEST_HEADER include/spdk/reduce.h 00:02:57.015 TEST_HEADER include/spdk/scheduler.h 00:02:57.015 TEST_HEADER include/spdk/rpc.h 00:02:57.015 CC app/vhost/vhost.o 00:02:57.015 TEST_HEADER include/spdk/scsi.h 00:02:57.015 TEST_HEADER include/spdk/scsi_spec.h 00:02:57.015 TEST_HEADER include/spdk/stdinc.h 00:02:57.015 TEST_HEADER include/spdk/sock.h 00:02:57.015 CC app/iscsi_tgt/iscsi_tgt.o 00:02:57.015 TEST_HEADER include/spdk/thread.h 00:02:57.015 TEST_HEADER include/spdk/string.h 00:02:57.015 TEST_HEADER include/spdk/trace.h 00:02:57.015 TEST_HEADER include/spdk/trace_parser.h 00:02:57.015 TEST_HEADER include/spdk/tree.h 00:02:57.015 TEST_HEADER include/spdk/ublk.h 00:02:57.015 TEST_HEADER include/spdk/util.h 00:02:57.015 TEST_HEADER include/spdk/uuid.h 00:02:57.015 TEST_HEADER include/spdk/version.h 00:02:57.015 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:57.015 TEST_HEADER include/spdk/vhost.h 00:02:57.015 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:57.015 TEST_HEADER include/spdk/xor.h 00:02:57.015 TEST_HEADER include/spdk/vmd.h 00:02:57.015 TEST_HEADER include/spdk/zipf.h 00:02:57.015 CXX test/cpp_headers/accel.o 00:02:57.015 CXX test/cpp_headers/accel_module.o 00:02:57.015 CXX test/cpp_headers/barrier.o 00:02:57.015 CXX test/cpp_headers/assert.o 00:02:57.015 CXX test/cpp_headers/base64.o 00:02:57.015 CXX test/cpp_headers/bdev.o 00:02:57.015 CXX test/cpp_headers/bdev_zone.o 00:02:57.015 CXX test/cpp_headers/bdev_module.o 00:02:57.015 CXX test/cpp_headers/bit_array.o 00:02:57.015 CXX test/cpp_headers/bit_pool.o 00:02:57.015 CXX test/cpp_headers/blob_bdev.o 00:02:57.015 CXX test/cpp_headers/blobfs_bdev.o 00:02:57.015 CXX test/cpp_headers/blobfs.o 00:02:57.015 CXX test/cpp_headers/blob.o 00:02:57.015 CXX test/cpp_headers/config.o 00:02:57.015 CXX test/cpp_headers/conf.o 00:02:57.015 CXX test/cpp_headers/cpuset.o 00:02:57.015 CC app/spdk_tgt/spdk_tgt.o 00:02:57.015 CXX test/cpp_headers/crc16.o 00:02:57.015 CXX test/cpp_headers/crc32.o 00:02:57.015 CXX test/cpp_headers/crc64.o 00:02:57.015 CXX test/cpp_headers/dif.o 00:02:57.015 CXX test/cpp_headers/dma.o 00:02:57.015 CXX test/cpp_headers/endian.o 00:02:57.015 CXX test/cpp_headers/env_dpdk.o 00:02:57.015 CXX test/cpp_headers/env.o 00:02:57.015 CXX test/cpp_headers/event.o 00:02:57.015 CXX test/cpp_headers/fd_group.o 00:02:57.015 CXX test/cpp_headers/fd.o 00:02:57.015 CXX test/cpp_headers/file.o 00:02:57.015 CXX test/cpp_headers/ftl.o 00:02:57.015 CC examples/accel/perf/accel_perf.o 00:02:57.015 CXX test/cpp_headers/gpt_spec.o 00:02:57.015 CXX test/cpp_headers/hexlify.o 00:02:57.015 CXX test/cpp_headers/histogram_data.o 00:02:57.015 CXX test/cpp_headers/idxd.o 00:02:57.015 CXX test/cpp_headers/idxd_spec.o 00:02:57.015 CXX test/cpp_headers/init.o 00:02:57.015 CC test/app/stub/stub.o 00:02:57.015 CC examples/vmd/led/led.o 00:02:57.015 CC test/env/pci/pci_ut.o 00:02:57.015 CC 
app/fio/nvme/fio_plugin.o 00:02:57.015 CC test/app/jsoncat/jsoncat.o 00:02:57.015 CC examples/vmd/lsvmd/lsvmd.o 00:02:57.015 CC test/app/histogram_perf/histogram_perf.o 00:02:57.015 CC examples/nvme/hello_world/hello_world.o 00:02:57.015 CC test/env/vtophys/vtophys.o 00:02:57.015 CC test/env/memory/memory_ut.o 00:02:57.015 CC examples/ioat/verify/verify.o 00:02:57.015 CC test/thread/poller_perf/poller_perf.o 00:02:57.015 CC test/event/reactor_perf/reactor_perf.o 00:02:57.015 CC examples/bdev/hello_world/hello_bdev.o 00:02:57.015 CC examples/nvme/reconnect/reconnect.o 00:02:57.015 CC examples/idxd/perf/perf.o 00:02:57.015 CC examples/blob/hello_world/hello_blob.o 00:02:57.015 CC examples/bdev/bdevperf/bdevperf.o 00:02:57.015 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:57.015 CC test/event/event_perf/event_perf.o 00:02:57.015 CC examples/ioat/perf/perf.o 00:02:57.015 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:57.015 CC examples/nvme/arbitration/arbitration.o 00:02:57.015 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:57.015 CXX test/cpp_headers/ioat.o 00:02:57.015 CC examples/nvme/abort/abort.o 00:02:57.015 CC examples/util/zipf/zipf.o 00:02:57.015 CC test/nvme/e2edp/nvme_dp.o 00:02:57.015 CC test/nvme/fdp/fdp.o 00:02:57.015 CC examples/nvme/hotplug/hotplug.o 00:02:57.015 CC test/nvme/cuse/cuse.o 00:02:57.015 CC test/event/reactor/reactor.o 00:02:57.015 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:57.015 CC test/nvme/reset/reset.o 00:02:57.015 CC test/thread/lock/spdk_lock.o 00:02:57.015 CC test/nvme/aer/aer.o 00:02:57.015 CC examples/sock/hello_world/hello_sock.o 00:02:57.015 CC test/nvme/err_injection/err_injection.o 00:02:57.015 CC test/nvme/reserve/reserve.o 00:02:57.015 CC test/nvme/sgl/sgl.o 00:02:57.015 CC test/nvme/simple_copy/simple_copy.o 00:02:57.015 CC test/nvme/startup/startup.o 00:02:57.015 CC test/nvme/overhead/overhead.o 00:02:57.015 CC test/event/app_repeat/app_repeat.o 00:02:57.015 CC test/nvme/fused_ordering/fused_ordering.o 00:02:57.015 CC test/nvme/connect_stress/connect_stress.o 00:02:57.015 LINK spdk_lspci 00:02:57.015 CC test/nvme/boot_partition/boot_partition.o 00:02:57.015 CC test/accel/dif/dif.o 00:02:57.015 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:57.015 CC test/nvme/compliance/nvme_compliance.o 00:02:57.015 CC examples/blob/cli/blobcli.o 00:02:57.015 CC app/fio/bdev/fio_plugin.o 00:02:57.015 CC test/event/scheduler/scheduler.o 00:02:57.015 CC examples/thread/thread/thread_ex.o 00:02:57.015 CC test/app/bdev_svc/bdev_svc.o 00:02:57.015 CC test/dma/test_dma/test_dma.o 00:02:57.015 CC examples/nvmf/nvmf/nvmf.o 00:02:57.015 CC test/bdev/bdevio/bdevio.o 00:02:57.015 CC test/blobfs/mkfs/mkfs.o 00:02:57.015 LINK rpc_client_test 00:02:57.015 LINK spdk_nvme_discover 00:02:57.015 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:57.276 LINK nvmf_tgt 00:02:57.276 LINK interrupt_tgt 00:02:57.276 CC test/env/mem_callbacks/mem_callbacks.o 00:02:57.276 CC test/lvol/esnap/esnap.o 00:02:57.276 LINK spdk_trace_record 00:02:57.276 CXX test/cpp_headers/ioat_spec.o 00:02:57.276 CXX test/cpp_headers/iscsi_spec.o 00:02:57.276 CXX test/cpp_headers/json.o 00:02:57.276 LINK jsoncat 00:02:57.276 LINK led 00:02:57.276 CXX test/cpp_headers/jsonrpc.o 00:02:57.276 LINK lsvmd 00:02:57.276 CXX test/cpp_headers/likely.o 00:02:57.276 CXX test/cpp_headers/log.o 00:02:57.276 CXX test/cpp_headers/lvol.o 00:02:57.276 LINK histogram_perf 00:02:57.276 LINK vtophys 00:02:57.276 CXX test/cpp_headers/memory.o 00:02:57.276 CXX test/cpp_headers/mmio.o 00:02:57.276 CXX 
test/cpp_headers/nbd.o 00:02:57.276 LINK reactor_perf 00:02:57.276 CXX test/cpp_headers/notify.o 00:02:57.276 CXX test/cpp_headers/nvme.o 00:02:57.276 CXX test/cpp_headers/nvme_intel.o 00:02:57.276 CXX test/cpp_headers/nvme_ocssd.o 00:02:57.276 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:57.276 CXX test/cpp_headers/nvme_spec.o 00:02:57.276 CXX test/cpp_headers/nvme_zns.o 00:02:57.276 LINK vhost 00:02:57.276 CXX test/cpp_headers/nvmf_cmd.o 00:02:57.276 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:57.276 CXX test/cpp_headers/nvmf.o 00:02:57.276 CXX test/cpp_headers/nvmf_spec.o 00:02:57.276 CXX test/cpp_headers/nvmf_transport.o 00:02:57.276 LINK reactor 00:02:57.276 CXX test/cpp_headers/opal.o 00:02:57.276 CXX test/cpp_headers/opal_spec.o 00:02:57.276 LINK event_perf 00:02:57.276 LINK poller_perf 00:02:57.276 CXX test/cpp_headers/pci_ids.o 00:02:57.276 CXX test/cpp_headers/pipe.o 00:02:57.276 LINK env_dpdk_post_init 00:02:57.276 LINK iscsi_tgt 00:02:57.276 LINK stub 00:02:57.276 CXX test/cpp_headers/queue.o 00:02:57.276 LINK zipf 00:02:57.276 CXX test/cpp_headers/reduce.o 00:02:57.276 LINK app_repeat 00:02:57.276 CXX test/cpp_headers/rpc.o 00:02:57.276 CXX test/cpp_headers/scheduler.o 00:02:57.276 CXX test/cpp_headers/scsi.o 00:02:57.276 CXX test/cpp_headers/scsi_spec.o 00:02:57.276 CXX test/cpp_headers/sock.o 00:02:57.276 LINK spdk_tgt 00:02:57.276 LINK startup 00:02:57.276 LINK boot_partition 00:02:57.276 LINK err_injection 00:02:57.276 LINK pmr_persistence 00:02:57.276 LINK connect_stress 00:02:57.276 CXX test/cpp_headers/stdinc.o 00:02:57.276 LINK cmb_copy 00:02:57.276 LINK verify 00:02:57.276 LINK doorbell_aers 00:02:57.276 LINK hello_world 00:02:57.276 LINK reserve 00:02:57.276 LINK fused_ordering 00:02:57.276 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:57.276 LINK ioat_perf 00:02:57.276 LINK bdev_svc 00:02:57.276 LINK hotplug 00:02:57.276 LINK hello_blob 00:02:57.276 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:57.276 LINK hello_bdev 00:02:57.276 LINK spdk_trace 00:02:57.276 LINK hello_sock 00:02:57.276 LINK simple_copy 00:02:57.276 LINK scheduler 00:02:57.276 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:57.276 LINK reset 00:02:57.276 LINK mkfs 00:02:57.276 LINK fdp 00:02:57.276 LINK sgl 00:02:57.276 LINK nvme_dp 00:02:57.537 LINK aer 00:02:57.537 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:57.537 LINK thread 00:02:57.537 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:57.537 LINK overhead 00:02:57.537 CXX test/cpp_headers/string.o 00:02:57.537 LINK mem_callbacks 00:02:57.537 CXX test/cpp_headers/thread.o 00:02:57.537 CXX test/cpp_headers/trace.o 00:02:57.537 CXX test/cpp_headers/trace_parser.o 00:02:57.537 CXX test/cpp_headers/tree.o 00:02:57.537 CXX test/cpp_headers/ublk.o 00:02:57.537 CXX test/cpp_headers/util.o 00:02:57.537 CXX test/cpp_headers/uuid.o 00:02:57.537 CXX test/cpp_headers/version.o 00:02:57.537 LINK reconnect 00:02:57.537 CXX test/cpp_headers/vfio_user_pci.o 00:02:57.537 CXX test/cpp_headers/vfio_user_spec.o 00:02:57.537 CXX test/cpp_headers/vhost.o 00:02:57.537 CXX test/cpp_headers/vmd.o 00:02:57.537 CXX test/cpp_headers/xor.o 00:02:57.537 CXX test/cpp_headers/zipf.o 00:02:57.537 LINK idxd_perf 00:02:57.537 LINK arbitration 00:02:57.537 LINK nvmf 00:02:57.537 LINK spdk_dd 00:02:57.537 LINK test_dma 00:02:57.537 LINK abort 00:02:57.537 LINK dif 00:02:57.537 LINK bdevio 00:02:57.537 LINK accel_perf 00:02:57.537 LINK nvme_compliance 00:02:57.537 LINK nvme_manage 00:02:57.537 LINK pci_ut 00:02:57.796 LINK blobcli 00:02:57.796 LINK nvme_fuzz 
00:02:57.796 LINK memory_ut 00:02:57.796 LINK spdk_bdev 00:02:57.796 LINK spdk_nvme 00:02:57.796 LINK llvm_vfio_fuzz 00:02:57.796 LINK spdk_nvme_identify 00:02:57.796 LINK vhost_fuzz 00:02:58.054 LINK spdk_nvme_perf 00:02:58.054 LINK bdevperf 00:02:58.054 LINK spdk_top 00:02:58.054 LINK cuse 00:02:58.312 LINK llvm_nvme_fuzz 00:02:58.312 LINK spdk_lock 00:02:58.570 LINK iscsi_fuzz 00:03:01.104 LINK esnap 00:03:01.104 00:03:01.104 real 0m23.820s 00:03:01.104 user 4m13.711s 00:03:01.104 sys 2m14.336s 00:03:01.104 06:58:19 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:03:01.104 06:58:19 -- common/autotest_common.sh@10 -- $ set +x 00:03:01.104 ************************************ 00:03:01.104 END TEST make 00:03:01.104 ************************************ 00:03:01.364 06:58:19 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:01.364 06:58:19 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:01.364 06:58:19 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:01.364 06:58:19 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:01.364 06:58:19 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:01.364 06:58:19 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:01.364 06:58:19 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:01.364 06:58:19 -- scripts/common.sh@335 -- # IFS=.-: 00:03:01.364 06:58:19 -- scripts/common.sh@335 -- # read -ra ver1 00:03:01.364 06:58:19 -- scripts/common.sh@336 -- # IFS=.-: 00:03:01.364 06:58:19 -- scripts/common.sh@336 -- # read -ra ver2 00:03:01.364 06:58:19 -- scripts/common.sh@337 -- # local 'op=<' 00:03:01.364 06:58:19 -- scripts/common.sh@339 -- # ver1_l=2 00:03:01.364 06:58:19 -- scripts/common.sh@340 -- # ver2_l=1 00:03:01.364 06:58:19 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:01.364 06:58:19 -- scripts/common.sh@343 -- # case "$op" in 00:03:01.364 06:58:19 -- scripts/common.sh@344 -- # : 1 00:03:01.364 06:58:19 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:01.364 06:58:19 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:01.364 06:58:19 -- scripts/common.sh@364 -- # decimal 1 00:03:01.364 06:58:19 -- scripts/common.sh@352 -- # local d=1 00:03:01.364 06:58:19 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:01.364 06:58:19 -- scripts/common.sh@354 -- # echo 1 00:03:01.364 06:58:19 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:01.364 06:58:19 -- scripts/common.sh@365 -- # decimal 2 00:03:01.364 06:58:19 -- scripts/common.sh@352 -- # local d=2 00:03:01.364 06:58:19 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:01.364 06:58:19 -- scripts/common.sh@354 -- # echo 2 00:03:01.364 06:58:19 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:01.364 06:58:19 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:01.364 06:58:19 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:01.364 06:58:19 -- scripts/common.sh@367 -- # return 0 00:03:01.364 06:58:19 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:01.364 06:58:19 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:01.364 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:01.364 --rc genhtml_branch_coverage=1 00:03:01.364 --rc genhtml_function_coverage=1 00:03:01.364 --rc genhtml_legend=1 00:03:01.364 --rc geninfo_all_blocks=1 00:03:01.364 --rc geninfo_unexecuted_blocks=1 00:03:01.364 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:01.364 ' 00:03:01.364 06:58:19 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:01.364 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:01.364 --rc genhtml_branch_coverage=1 00:03:01.364 --rc genhtml_function_coverage=1 00:03:01.364 --rc genhtml_legend=1 00:03:01.364 --rc geninfo_all_blocks=1 00:03:01.364 --rc geninfo_unexecuted_blocks=1 00:03:01.364 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:01.364 ' 00:03:01.364 06:58:19 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:01.364 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:01.364 --rc genhtml_branch_coverage=1 00:03:01.364 --rc genhtml_function_coverage=1 00:03:01.364 --rc genhtml_legend=1 00:03:01.364 --rc geninfo_all_blocks=1 00:03:01.364 --rc geninfo_unexecuted_blocks=1 00:03:01.364 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:01.364 ' 00:03:01.364 06:58:19 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:01.364 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:01.364 --rc genhtml_branch_coverage=1 00:03:01.364 --rc genhtml_function_coverage=1 00:03:01.364 --rc genhtml_legend=1 00:03:01.364 --rc geninfo_all_blocks=1 00:03:01.364 --rc geninfo_unexecuted_blocks=1 00:03:01.364 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:01.364 ' 00:03:01.364 06:58:19 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:01.364 06:58:19 -- nvmf/common.sh@7 -- # uname -s 00:03:01.364 06:58:19 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:01.364 06:58:19 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:01.364 06:58:19 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:01.364 06:58:19 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:01.364 06:58:19 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:01.364 06:58:19 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:01.364 06:58:19 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:01.364 06:58:19 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:01.364 06:58:19 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:01.364 06:58:19 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:01.364 06:58:19 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:01.364 06:58:19 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:01.364 06:58:19 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:01.364 06:58:19 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:01.364 06:58:19 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:01.364 06:58:19 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:01.364 06:58:19 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:01.364 06:58:19 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:01.364 06:58:19 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:01.364 06:58:19 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:01.364 06:58:19 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:01.365 06:58:19 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:01.365 06:58:19 -- paths/export.sh@5 -- # export PATH 00:03:01.365 06:58:19 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:01.365 06:58:19 -- nvmf/common.sh@46 -- # : 0 00:03:01.365 06:58:19 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:01.365 06:58:19 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:01.365 06:58:19 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:01.365 06:58:19 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:01.365 06:58:19 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:01.365 06:58:19 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:01.365 06:58:19 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:01.365 06:58:19 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:01.365 06:58:19 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:01.365 06:58:19 -- spdk/autotest.sh@32 -- # uname -s 00:03:01.365 06:58:19 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:01.365 06:58:19 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:01.365 06:58:19 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:01.365 06:58:19 -- 
spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:01.365 06:58:19 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:01.365 06:58:19 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:01.624 06:58:19 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:01.624 06:58:19 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:01.624 06:58:19 -- spdk/autotest.sh@48 -- # udevadm_pid=415425 00:03:01.624 06:58:19 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:01.624 06:58:19 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:01.624 06:58:19 -- spdk/autotest.sh@54 -- # echo 415427 00:03:01.624 06:58:19 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:01.624 06:58:19 -- spdk/autotest.sh@56 -- # echo 415428 00:03:01.624 06:58:19 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:01.624 06:58:19 -- spdk/autotest.sh@58 -- # [[ ............................... != QEMU ]] 00:03:01.624 06:58:19 -- spdk/autotest.sh@60 -- # echo 415429 00:03:01.624 06:58:19 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:01.624 06:58:19 -- spdk/autotest.sh@62 -- # echo 415430 00:03:01.624 06:58:19 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:01.625 06:58:19 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:01.625 06:58:19 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:01.625 06:58:19 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:01.625 06:58:19 -- common/autotest_common.sh@10 -- # set +x 00:03:01.625 06:58:19 -- spdk/autotest.sh@70 -- # create_test_list 00:03:01.625 06:58:19 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:01.625 06:58:19 -- common/autotest_common.sh@10 -- # set +x 00:03:01.625 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:01.625 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:01.625 06:58:19 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:01.625 06:58:19 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:01.625 06:58:19 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:01.625 06:58:19 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:01.625 06:58:19 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:01.625 06:58:19 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:01.625 06:58:19 -- common/autotest_common.sh@1450 -- # uname 00:03:01.625 06:58:19 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:03:01.625 06:58:19 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 
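The trace above shows autotest.sh gating the lcov flags on the installed lcov version via the lt/cmp_versions helpers from scripts/common.sh. A condensed sketch of that dotted-version comparison, assuming only what the trace itself shows (split on '.', '-' and ':', then compare numerically component by component; the helper name lt mirrors the traced call):

    lt() {
        local -a ver1 ver2
        local v max
        IFS=.-: read -ra ver1 <<< "$1"
        IFS=.-: read -ra ver2 <<< "$2"
        max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < max; v++ )); do
            # Missing components count as 0, so "1.15" vs "2" compares as 1.15.0 vs 2.0.0
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
        done
        return 1   # equal versions are not strictly less-than
    }

With lcov 1.15 installed, lt 1.15 2 succeeds, which is why the run above exports the --rc lcov_branch_coverage=1 / --rc lcov_function_coverage=1 option set before invoking lcov.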
00:03:01.625 06:58:19 -- common/autotest_common.sh@1470 -- # uname 00:03:01.625 06:58:19 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:03:01.625 06:58:19 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:03:01.625 06:58:19 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:01.625 lcov: LCOV version 1.15 00:03:01.625 06:58:19 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:03.531 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:03:03.531 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:03:03.531 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:03:15.746 06:58:33 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:03:15.746 06:58:33 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:15.746 06:58:33 -- common/autotest_common.sh@10 -- # set +x 00:03:15.746 06:58:33 -- spdk/autotest.sh@89 -- # rm -f 00:03:15.746 06:58:33 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:19.037 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:19.037 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:19.037 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:19.037 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:19.297 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:19.297 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:19.297 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:19.297 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:19.297 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:19.297 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:19.297 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:19.297 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:19.297 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:19.556 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:19.556 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:19.556 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:19.556 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:19.556 06:58:37 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:03:19.556 06:58:37 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:19.556 06:58:37 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:19.556 06:58:37 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:19.556 06:58:37 -- 
common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:19.556 06:58:37 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:19.556 06:58:37 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:19.556 06:58:37 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:19.556 06:58:37 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:19.556 06:58:37 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:03:19.556 06:58:37 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:03:19.556 06:58:37 -- spdk/autotest.sh@108 -- # grep -v p 00:03:19.556 06:58:37 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:19.556 06:58:37 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:03:19.556 06:58:37 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:03:19.556 06:58:37 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:19.556 06:58:37 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:19.556 No valid GPT data, bailing 00:03:19.556 06:58:37 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:19.556 06:58:37 -- scripts/common.sh@393 -- # pt= 00:03:19.556 06:58:37 -- scripts/common.sh@394 -- # return 1 00:03:19.556 06:58:37 -- spdk/autotest.sh@112 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:19.556 1+0 records in 00:03:19.556 1+0 records out 00:03:19.556 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00598335 s, 175 MB/s 00:03:19.556 06:58:37 -- spdk/autotest.sh@116 -- # sync 00:03:19.556 06:58:37 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:19.556 06:58:37 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:19.556 06:58:37 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:27.682 06:58:44 -- spdk/autotest.sh@122 -- # uname -s 00:03:27.682 06:58:44 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:03:27.682 06:58:44 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:27.682 06:58:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:27.682 06:58:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:27.682 06:58:44 -- common/autotest_common.sh@10 -- # set +x 00:03:27.682 ************************************ 00:03:27.682 START TEST setup.sh 00:03:27.682 ************************************ 00:03:27.682 06:58:44 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:27.682 * Looking for test storage... 
00:03:27.682 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:27.682 06:58:44 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:27.682 06:58:44 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:27.682 06:58:44 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:27.682 06:58:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:27.682 06:58:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:27.682 06:58:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:27.682 06:58:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:27.682 06:58:45 -- scripts/common.sh@335 -- # IFS=.-: 00:03:27.682 06:58:45 -- scripts/common.sh@335 -- # read -ra ver1 00:03:27.682 06:58:45 -- scripts/common.sh@336 -- # IFS=.-: 00:03:27.682 06:58:45 -- scripts/common.sh@336 -- # read -ra ver2 00:03:27.682 06:58:45 -- scripts/common.sh@337 -- # local 'op=<' 00:03:27.682 06:58:45 -- scripts/common.sh@339 -- # ver1_l=2 00:03:27.682 06:58:45 -- scripts/common.sh@340 -- # ver2_l=1 00:03:27.682 06:58:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:27.682 06:58:45 -- scripts/common.sh@343 -- # case "$op" in 00:03:27.682 06:58:45 -- scripts/common.sh@344 -- # : 1 00:03:27.682 06:58:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:27.682 06:58:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:27.682 06:58:45 -- scripts/common.sh@364 -- # decimal 1 00:03:27.682 06:58:45 -- scripts/common.sh@352 -- # local d=1 00:03:27.682 06:58:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:27.682 06:58:45 -- scripts/common.sh@354 -- # echo 1 00:03:27.682 06:58:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:27.682 06:58:45 -- scripts/common.sh@365 -- # decimal 2 00:03:27.682 06:58:45 -- scripts/common.sh@352 -- # local d=2 00:03:27.682 06:58:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:27.682 06:58:45 -- scripts/common.sh@354 -- # echo 2 00:03:27.682 06:58:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:27.682 06:58:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:27.682 06:58:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:27.682 06:58:45 -- scripts/common.sh@367 -- # return 0 00:03:27.682 06:58:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:27.682 06:58:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:27.682 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.682 --rc genhtml_branch_coverage=1 00:03:27.682 --rc genhtml_function_coverage=1 00:03:27.682 --rc genhtml_legend=1 00:03:27.682 --rc geninfo_all_blocks=1 00:03:27.682 --rc geninfo_unexecuted_blocks=1 00:03:27.682 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.682 ' 00:03:27.682 06:58:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:27.682 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.682 --rc genhtml_branch_coverage=1 00:03:27.682 --rc genhtml_function_coverage=1 00:03:27.682 --rc genhtml_legend=1 00:03:27.682 --rc geninfo_all_blocks=1 00:03:27.682 --rc geninfo_unexecuted_blocks=1 00:03:27.682 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.682 ' 00:03:27.682 06:58:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:27.682 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.682 --rc genhtml_branch_coverage=1 
00:03:27.682 --rc genhtml_function_coverage=1 00:03:27.682 --rc genhtml_legend=1 00:03:27.682 --rc geninfo_all_blocks=1 00:03:27.682 --rc geninfo_unexecuted_blocks=1 00:03:27.682 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.682 ' 00:03:27.682 06:58:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:27.682 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.682 --rc genhtml_branch_coverage=1 00:03:27.682 --rc genhtml_function_coverage=1 00:03:27.682 --rc genhtml_legend=1 00:03:27.682 --rc geninfo_all_blocks=1 00:03:27.682 --rc geninfo_unexecuted_blocks=1 00:03:27.682 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.682 ' 00:03:27.682 06:58:45 -- setup/test-setup.sh@10 -- # uname -s 00:03:27.682 06:58:45 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:27.682 06:58:45 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:27.682 06:58:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:27.682 06:58:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:27.682 06:58:45 -- common/autotest_common.sh@10 -- # set +x 00:03:27.682 ************************************ 00:03:27.682 START TEST acl 00:03:27.682 ************************************ 00:03:27.683 06:58:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:27.683 * Looking for test storage... 00:03:27.683 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:27.683 06:58:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:27.683 06:58:45 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:27.683 06:58:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:27.683 06:58:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:27.683 06:58:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:27.683 06:58:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:27.683 06:58:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:27.683 06:58:45 -- scripts/common.sh@335 -- # IFS=.-: 00:03:27.683 06:58:45 -- scripts/common.sh@335 -- # read -ra ver1 00:03:27.683 06:58:45 -- scripts/common.sh@336 -- # IFS=.-: 00:03:27.683 06:58:45 -- scripts/common.sh@336 -- # read -ra ver2 00:03:27.683 06:58:45 -- scripts/common.sh@337 -- # local 'op=<' 00:03:27.683 06:58:45 -- scripts/common.sh@339 -- # ver1_l=2 00:03:27.683 06:58:45 -- scripts/common.sh@340 -- # ver2_l=1 00:03:27.683 06:58:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:27.683 06:58:45 -- scripts/common.sh@343 -- # case "$op" in 00:03:27.683 06:58:45 -- scripts/common.sh@344 -- # : 1 00:03:27.683 06:58:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:27.683 06:58:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:27.683 06:58:45 -- scripts/common.sh@364 -- # decimal 1 00:03:27.683 06:58:45 -- scripts/common.sh@352 -- # local d=1 00:03:27.683 06:58:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:27.683 06:58:45 -- scripts/common.sh@354 -- # echo 1 00:03:27.683 06:58:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:27.683 06:58:45 -- scripts/common.sh@365 -- # decimal 2 00:03:27.683 06:58:45 -- scripts/common.sh@352 -- # local d=2 00:03:27.683 06:58:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:27.683 06:58:45 -- scripts/common.sh@354 -- # echo 2 00:03:27.683 06:58:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:27.683 06:58:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:27.683 06:58:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:27.683 06:58:45 -- scripts/common.sh@367 -- # return 0 00:03:27.683 06:58:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:27.683 06:58:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:27.683 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.683 --rc genhtml_branch_coverage=1 00:03:27.683 --rc genhtml_function_coverage=1 00:03:27.683 --rc genhtml_legend=1 00:03:27.683 --rc geninfo_all_blocks=1 00:03:27.683 --rc geninfo_unexecuted_blocks=1 00:03:27.683 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.683 ' 00:03:27.683 06:58:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:27.683 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.683 --rc genhtml_branch_coverage=1 00:03:27.683 --rc genhtml_function_coverage=1 00:03:27.683 --rc genhtml_legend=1 00:03:27.683 --rc geninfo_all_blocks=1 00:03:27.683 --rc geninfo_unexecuted_blocks=1 00:03:27.683 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.683 ' 00:03:27.683 06:58:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:27.683 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.683 --rc genhtml_branch_coverage=1 00:03:27.683 --rc genhtml_function_coverage=1 00:03:27.683 --rc genhtml_legend=1 00:03:27.683 --rc geninfo_all_blocks=1 00:03:27.683 --rc geninfo_unexecuted_blocks=1 00:03:27.683 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.683 ' 00:03:27.683 06:58:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:27.683 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.683 --rc genhtml_branch_coverage=1 00:03:27.683 --rc genhtml_function_coverage=1 00:03:27.683 --rc genhtml_legend=1 00:03:27.683 --rc geninfo_all_blocks=1 00:03:27.683 --rc geninfo_unexecuted_blocks=1 00:03:27.683 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.683 ' 00:03:27.683 06:58:45 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:27.683 06:58:45 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:27.683 06:58:45 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:27.683 06:58:45 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:27.683 06:58:45 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:27.683 06:58:45 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:27.683 06:58:45 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:27.683 06:58:45 -- common/autotest_common.sh@1659 -- # [[ -e 
/sys/block/nvme0n1/queue/zoned ]] 00:03:27.683 06:58:45 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:27.683 06:58:45 -- setup/acl.sh@12 -- # devs=() 00:03:27.683 06:58:45 -- setup/acl.sh@12 -- # declare -a devs 00:03:27.683 06:58:45 -- setup/acl.sh@13 -- # drivers=() 00:03:27.683 06:58:45 -- setup/acl.sh@13 -- # declare -A drivers 00:03:27.683 06:58:45 -- setup/acl.sh@51 -- # setup reset 00:03:27.683 06:58:45 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:27.683 06:58:45 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:30.976 06:58:49 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:30.976 06:58:49 -- setup/acl.sh@16 -- # local dev driver 00:03:30.976 06:58:49 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:30.976 06:58:49 -- setup/acl.sh@15 -- # setup output status 00:03:30.976 06:58:49 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:30.976 06:58:49 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:34.267 Hugepages 00:03:34.267 node hugesize free / total 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 00:03:34.267 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 
06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.267 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.267 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.267 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.527 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:34.527 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.527 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.527 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.527 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:34.527 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.527 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.527 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.527 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:34.527 06:58:52 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:34.527 06:58:52 -- setup/acl.sh@20 -- # continue 00:03:34.527 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.527 06:58:52 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:34.527 06:58:52 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:34.527 06:58:52 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:34.527 06:58:52 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:34.527 06:58:52 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:34.527 06:58:52 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:34.527 06:58:52 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:34.527 06:58:52 -- setup/acl.sh@54 -- # run_test denied denied 00:03:34.527 06:58:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:34.527 06:58:52 -- common/autotest_common.sh@1093 -- # 
xtrace_disable 00:03:34.527 06:58:52 -- common/autotest_common.sh@10 -- # set +x 00:03:34.527 ************************************ 00:03:34.527 START TEST denied 00:03:34.527 ************************************ 00:03:34.527 06:58:52 -- common/autotest_common.sh@1114 -- # denied 00:03:34.527 06:58:52 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:34.527 06:58:52 -- setup/acl.sh@38 -- # setup output config 00:03:34.527 06:58:52 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:34.527 06:58:52 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:34.527 06:58:52 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:38.724 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:38.724 06:58:56 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:38.724 06:58:56 -- setup/acl.sh@28 -- # local dev driver 00:03:38.724 06:58:56 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:38.724 06:58:56 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:38.724 06:58:56 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:38.724 06:58:56 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:38.724 06:58:56 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:38.724 06:58:56 -- setup/acl.sh@41 -- # setup reset 00:03:38.724 06:58:56 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:38.724 06:58:56 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:44.002 00:03:44.002 real 0m8.589s 00:03:44.002 user 0m2.770s 00:03:44.002 sys 0m5.129s 00:03:44.002 06:59:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:44.002 06:59:01 -- common/autotest_common.sh@10 -- # set +x 00:03:44.002 ************************************ 00:03:44.002 END TEST denied 00:03:44.002 ************************************ 00:03:44.002 06:59:01 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:44.002 06:59:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:44.002 06:59:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:44.002 06:59:01 -- common/autotest_common.sh@10 -- # set +x 00:03:44.002 ************************************ 00:03:44.002 START TEST allowed 00:03:44.002 ************************************ 00:03:44.002 06:59:01 -- common/autotest_common.sh@1114 -- # allowed 00:03:44.002 06:59:01 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:44.002 06:59:01 -- setup/acl.sh@45 -- # setup output config 00:03:44.002 06:59:01 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:44.002 06:59:01 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:44.002 06:59:01 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:48.195 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:48.195 06:59:06 -- setup/acl.sh@47 -- # verify 00:03:48.195 06:59:06 -- setup/acl.sh@28 -- # local dev driver 00:03:48.195 06:59:06 -- setup/acl.sh@48 -- # setup reset 00:03:48.195 06:59:06 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:48.195 06:59:06 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:52.386 00:03:52.386 real 0m9.235s 00:03:52.386 user 0m2.738s 00:03:52.386 sys 0m5.058s 00:03:52.386 06:59:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:52.386 06:59:10 -- common/autotest_common.sh@10 -- # set +x 00:03:52.386 ************************************ 
00:03:52.386 END TEST allowed 00:03:52.386 ************************************ 00:03:52.386 00:03:52.386 real 0m25.520s 00:03:52.386 user 0m8.341s 00:03:52.386 sys 0m15.360s 00:03:52.386 06:59:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:52.386 06:59:10 -- common/autotest_common.sh@10 -- # set +x 00:03:52.386 ************************************ 00:03:52.386 END TEST acl 00:03:52.386 ************************************ 00:03:52.386 06:59:10 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:52.386 06:59:10 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:52.386 06:59:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:52.386 06:59:10 -- common/autotest_common.sh@10 -- # set +x 00:03:52.386 ************************************ 00:03:52.386 START TEST hugepages 00:03:52.386 ************************************ 00:03:52.386 06:59:10 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:52.647 * Looking for test storage... 00:03:52.647 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:52.647 06:59:10 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:52.647 06:59:10 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:52.647 06:59:10 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:52.647 06:59:10 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:52.647 06:59:10 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:52.647 06:59:10 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:52.647 06:59:10 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:52.647 06:59:10 -- scripts/common.sh@335 -- # IFS=.-: 00:03:52.647 06:59:10 -- scripts/common.sh@335 -- # read -ra ver1 00:03:52.647 06:59:10 -- scripts/common.sh@336 -- # IFS=.-: 00:03:52.647 06:59:10 -- scripts/common.sh@336 -- # read -ra ver2 00:03:52.647 06:59:10 -- scripts/common.sh@337 -- # local 'op=<' 00:03:52.647 06:59:10 -- scripts/common.sh@339 -- # ver1_l=2 00:03:52.647 06:59:10 -- scripts/common.sh@340 -- # ver2_l=1 00:03:52.647 06:59:10 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:52.647 06:59:10 -- scripts/common.sh@343 -- # case "$op" in 00:03:52.647 06:59:10 -- scripts/common.sh@344 -- # : 1 00:03:52.647 06:59:10 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:52.647 06:59:10 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:52.647 06:59:10 -- scripts/common.sh@364 -- # decimal 1 00:03:52.647 06:59:10 -- scripts/common.sh@352 -- # local d=1 00:03:52.647 06:59:10 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:52.647 06:59:10 -- scripts/common.sh@354 -- # echo 1 00:03:52.647 06:59:10 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:52.647 06:59:10 -- scripts/common.sh@365 -- # decimal 2 00:03:52.647 06:59:10 -- scripts/common.sh@352 -- # local d=2 00:03:52.647 06:59:10 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:52.647 06:59:10 -- scripts/common.sh@354 -- # echo 2 00:03:52.647 06:59:10 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:52.647 06:59:10 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:52.647 06:59:10 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:52.647 06:59:10 -- scripts/common.sh@367 -- # return 0 00:03:52.647 06:59:10 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:52.647 06:59:10 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:52.647 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:52.647 --rc genhtml_branch_coverage=1 00:03:52.647 --rc genhtml_function_coverage=1 00:03:52.647 --rc genhtml_legend=1 00:03:52.647 --rc geninfo_all_blocks=1 00:03:52.647 --rc geninfo_unexecuted_blocks=1 00:03:52.647 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:52.647 ' 00:03:52.647 06:59:10 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:52.647 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:52.647 --rc genhtml_branch_coverage=1 00:03:52.647 --rc genhtml_function_coverage=1 00:03:52.647 --rc genhtml_legend=1 00:03:52.647 --rc geninfo_all_blocks=1 00:03:52.647 --rc geninfo_unexecuted_blocks=1 00:03:52.647 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:52.647 ' 00:03:52.647 06:59:10 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:52.647 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:52.647 --rc genhtml_branch_coverage=1 00:03:52.647 --rc genhtml_function_coverage=1 00:03:52.647 --rc genhtml_legend=1 00:03:52.647 --rc geninfo_all_blocks=1 00:03:52.647 --rc geninfo_unexecuted_blocks=1 00:03:52.647 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:52.647 ' 00:03:52.647 06:59:10 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:52.647 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:52.647 --rc genhtml_branch_coverage=1 00:03:52.647 --rc genhtml_function_coverage=1 00:03:52.647 --rc genhtml_legend=1 00:03:52.647 --rc geninfo_all_blocks=1 00:03:52.647 --rc geninfo_unexecuted_blocks=1 00:03:52.647 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:52.647 ' 00:03:52.647 06:59:10 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:52.647 06:59:10 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:52.647 06:59:10 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:52.647 06:59:10 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:52.647 06:59:10 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:52.647 06:59:10 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:52.647 06:59:10 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:52.647 06:59:10 -- setup/common.sh@18 -- # local node= 00:03:52.647 06:59:10 -- setup/common.sh@19 -- # local var val 
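The version gate traced just above is scripts/common.sh splitting each version string on '.', '-' and ':' and comparing it component by component, so that lcov 1.15 sorts below 2 and the newer --rc option syntax is selected. A condensed standalone sketch of that comparison (illustrative names, not the script verbatim, assuming purely numeric components) would be:
#!/usr/bin/env bash
# Sketch only: compare two dotted versions numerically, component by
# component, the way the traced cmp_versions path decides 'lt 1.15 2'.
lt() {                                  # succeeds when $1 < $2
  local IFS=.-:                         # split on dots, dashes and colons
  local -a a=($1) b=($2)
  local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
  for (( i = 0; i < n; i++ )); do
    (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
    (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
  done
  return 1                              # equal versions are not "less than"
}
lt 1.15 2 && echo "lcov 1.15 predates the 2.x option syntax"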
00:03:52.647 06:59:10 -- setup/common.sh@20 -- # local mem_f mem 00:03:52.647 06:59:10 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.647 06:59:10 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.647 06:59:10 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.647 06:59:10 -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.647 06:59:10 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.647 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.647 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 39497064 kB' 'MemAvailable: 43220116 kB' 'Buffers: 9316 kB' 'Cached: 12421860 kB' 'SwapCached: 0 kB' 'Active: 9314560 kB' 'Inactive: 3688932 kB' 'Active(anon): 8898144 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575708 kB' 'Mapped: 149768 kB' 'Shmem: 8325828 kB' 'KReclaimable: 229140 kB' 'Slab: 874768 kB' 'SReclaimable: 229140 kB' 'SUnreclaim: 645628 kB' 'KernelStack: 21792 kB' 'PageTables: 7800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 10173004 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214064 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 
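The long run of field checks that follows is get_meminfo scanning meminfo key by key until it reaches Hugepagesize, then echoing 2048. A minimal standalone take on that pattern (simplified relative to setup/common.sh, which maps the whole file into an array first) is:
#!/usr/bin/env bash
shopt -s extglob                        # for the +([0-9]) prefix pattern below
# Sketch of the get_meminfo pattern: walk the meminfo file, strip any
# 'Node N ' prefix, and print the value of one requested field.
get_meminfo() {                         # usage: get_meminfo Hugepagesize [node]
  local get=$1 node=${2-} file=/proc/meminfo line var val _
  [[ -n $node ]] && file=/sys/devices/system/node/node$node/meminfo
  while IFS= read -r line; do
    line=${line/#Node +([0-9]) /}       # per-node meminfo carries this prefix
    IFS=': ' read -r var val _ <<< "$line"
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done < "$file"
  return 1                              # field not present
}
get_meminfo Hugepagesize                # e.g. 2048 (kB), matching this run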
00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- 
# IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.648 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.648 06:59:10 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- 
setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # continue 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.649 06:59:10 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.649 06:59:10 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.649 06:59:10 -- setup/common.sh@33 -- # echo 2048 00:03:52.649 06:59:10 -- setup/common.sh@33 -- # return 0 00:03:52.649 06:59:10 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:52.649 06:59:10 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:52.649 06:59:10 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:52.649 06:59:10 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:52.649 06:59:10 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:52.649 06:59:10 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:52.649 06:59:10 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:52.649 06:59:10 -- setup/hugepages.sh@207 -- # get_nodes 00:03:52.649 06:59:10 -- setup/hugepages.sh@27 -- # local node 00:03:52.649 06:59:10 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:52.649 06:59:10 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:52.649 06:59:10 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:52.649 06:59:10 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:52.649 06:59:10 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:52.649 06:59:10 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:52.649 06:59:10 -- 
setup/hugepages.sh@208 -- # clear_hp 00:03:52.649 06:59:10 -- setup/hugepages.sh@37 -- # local node hp 00:03:52.649 06:59:10 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:52.649 06:59:10 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:52.649 06:59:10 -- setup/hugepages.sh@41 -- # echo 0 00:03:52.649 06:59:10 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:52.649 06:59:10 -- setup/hugepages.sh@41 -- # echo 0 00:03:52.649 06:59:10 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:52.649 06:59:10 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:52.649 06:59:10 -- setup/hugepages.sh@41 -- # echo 0 00:03:52.649 06:59:10 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:52.649 06:59:10 -- setup/hugepages.sh@41 -- # echo 0 00:03:52.649 06:59:10 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:52.649 06:59:10 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:52.649 06:59:10 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:52.649 06:59:10 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:52.649 06:59:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:52.649 06:59:10 -- common/autotest_common.sh@10 -- # set +x 00:03:52.649 ************************************ 00:03:52.649 START TEST default_setup 00:03:52.649 ************************************ 00:03:52.649 06:59:10 -- common/autotest_common.sh@1114 -- # default_setup 00:03:52.649 06:59:10 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:52.649 06:59:10 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:52.649 06:59:10 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:52.649 06:59:10 -- setup/hugepages.sh@51 -- # shift 00:03:52.649 06:59:10 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:52.649 06:59:10 -- setup/hugepages.sh@52 -- # local node_ids 00:03:52.649 06:59:10 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:52.649 06:59:10 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:52.649 06:59:10 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:52.649 06:59:10 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:52.649 06:59:10 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:52.649 06:59:10 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:52.649 06:59:10 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:52.649 06:59:10 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:52.649 06:59:10 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:52.649 06:59:10 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:52.649 06:59:10 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:52.649 06:59:10 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:52.649 06:59:10 -- setup/hugepages.sh@73 -- # return 0 00:03:52.649 06:59:10 -- setup/hugepages.sh@137 -- # setup output 00:03:52.649 06:59:10 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:52.649 06:59:10 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:56.842 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:00:04.3 (8086 2021): 
ioatdma -> vfio-pci 00:03:56.842 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:56.842 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:58.222 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:58.222 06:59:16 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:58.222 06:59:16 -- setup/hugepages.sh@89 -- # local node 00:03:58.222 06:59:16 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:58.222 06:59:16 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:58.222 06:59:16 -- setup/hugepages.sh@92 -- # local surp 00:03:58.222 06:59:16 -- setup/hugepages.sh@93 -- # local resv 00:03:58.222 06:59:16 -- setup/hugepages.sh@94 -- # local anon 00:03:58.222 06:59:16 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:58.222 06:59:16 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:58.222 06:59:16 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:58.222 06:59:16 -- setup/common.sh@18 -- # local node= 00:03:58.222 06:59:16 -- setup/common.sh@19 -- # local var val 00:03:58.222 06:59:16 -- setup/common.sh@20 -- # local mem_f mem 00:03:58.222 06:59:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:58.222 06:59:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:58.222 06:59:16 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:58.222 06:59:16 -- setup/common.sh@28 -- # mapfile -t mem 00:03:58.222 06:59:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:58.222 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.222 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41672736 kB' 'MemAvailable: 45395232 kB' 'Buffers: 9316 kB' 'Cached: 12422004 kB' 'SwapCached: 0 kB' 'Active: 9316080 kB' 'Inactive: 3688932 kB' 'Active(anon): 8899664 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576496 kB' 'Mapped: 150008 kB' 'Shmem: 8325972 kB' 'KReclaimable: 228028 kB' 'Slab: 873072 kB' 'SReclaimable: 228028 kB' 'SUnreclaim: 645044 kB' 'KernelStack: 21792 kB' 'PageTables: 7556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10176232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- 
setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.223 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.223 06:59:16 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:58.223 06:59:16 -- setup/common.sh@33 -- # echo 0 00:03:58.223 06:59:16 -- setup/common.sh@33 -- # return 0 00:03:58.223 06:59:16 -- setup/hugepages.sh@97 -- # anon=0 00:03:58.223 06:59:16 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:58.223 06:59:16 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:58.223 06:59:16 -- setup/common.sh@18 -- # local node= 00:03:58.224 06:59:16 -- setup/common.sh@19 -- # local var val 00:03:58.224 06:59:16 -- setup/common.sh@20 -- # local mem_f mem 00:03:58.224 06:59:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:58.224 06:59:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:58.224 06:59:16 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:58.224 06:59:16 -- setup/common.sh@28 -- # mapfile -t mem 00:03:58.224 06:59:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41677184 kB' 'MemAvailable: 45399680 kB' 'Buffers: 9316 kB' 'Cached: 12422008 kB' 'SwapCached: 0 kB' 'Active: 9316808 kB' 'Inactive: 3688932 kB' 'Active(anon): 8900392 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577288 kB' 'Mapped: 149980 kB' 'Shmem: 8325976 kB' 'KReclaimable: 228028 kB' 'Slab: 873104 kB' 'SReclaimable: 228028 kB' 'SUnreclaim: 645076 kB' 'KernelStack: 21856 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10174744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 
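Behind these repeated meminfo snapshots, the harness has already re-provisioned the pools: clear_hp zeroed every per-node hugepage count, then default_setup requested 1024 pages on node 0 only, which is exactly the HugePages_Total: 1024 the snapshots report. A hedged standalone sketch of that sysfs sequence (requires root; the paths are standard kernel sysfs, the 2048kB size matches this run):
#!/usr/bin/env bash
# Sketch only (run as root): zero every per-node hugepage pool, then ask
# node 0 for the 1024 x 2 MiB pages this run's default_setup test uses.
for node in /sys/devices/system/node/node[0-9]*; do
  for hp in "$node"/hugepages/hugepages-*; do
    echo 0 > "$hp/nr_hugepages"         # clear_hp equivalent
  done
done
echo 1024 > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
grep HugePages_Total /proc/meminfo      # expect: HugePages_Total:    1024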
00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # continue 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # IFS=': ' 00:03:58.224 06:59:16 -- setup/common.sh@31 -- # read -r var val _ 00:03:58.224 06:59:16 -- setup/common.sh@32 -- # [[ SwapFree 
00:03:58.224 [setup/common.sh@31-32: the IFS=': ' read/compare loop visits each remaining /proc/meminfo field; every field that is not HugePages_Surp takes the continue branch]
00:03:58.225 06:59:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:58.225 06:59:16 -- setup/common.sh@33 -- # echo 0
00:03:58.225 06:59:16 -- setup/common.sh@33 -- # return 0
00:03:58.225 06:59:16 -- setup/hugepages.sh@99 -- # surp=0
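The churn condensed above is the `get_meminfo` helper from test/setup/common.sh walking `/proc/meminfo` one `Key: value` pair at a time until the requested key matches, then echoing its value (0 for HugePages_Surp here). A minimal bash sketch of that pattern, reconstructed from the xtrace rather than copied from the SPDK source; the `_sketch` suffix marks it as illustrative:

    # Read /proc/meminfo one "Key: value [unit]" line at a time and
    # echo the value of the requested key.
    get_meminfo_sketch() {
        local get=$1 var val _ line
        local -a mem
        mapfile -t mem < /proc/meminfo              # one array element per line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"  # var=key, val=number, _=unit
            [[ $var == "$get" ]] || continue        # the continue lines in the trace
            echo "$val"                             # e.g. 0 for HugePages_Surp
            return 0
        done
    }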
00:03:58.225 06:59:16 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:58.225 06:59:16 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:58.225 06:59:16 -- setup/common.sh@18 -- # local node=
00:03:58.225 06:59:16 -- setup/common.sh@19 -- # local var val
00:03:58.225 06:59:16 -- setup/common.sh@20 -- # local mem_f mem
00:03:58.225 06:59:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:58.225 06:59:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:58.225 06:59:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:58.225 06:59:16 -- setup/common.sh@28 -- # mapfile -t mem
00:03:58.225 06:59:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:58.225 06:59:16 -- setup/common.sh@31 -- # IFS=': '
00:03:58.225 06:59:16 -- setup/common.sh@31 -- # read -r var val _
00:03:58.225 06:59:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41681712 kB' 'MemAvailable: 45404208 kB' 'Buffers: 9316 kB' 'Cached: 12422020 kB' 'SwapCached: 0 kB' 'Active: 9316212 kB' 'Inactive: 3688932 kB' 'Active(anon): 8899796 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577076 kB' 'Mapped: 149904 kB' 'Shmem: 8325988 kB' 'KReclaimable: 228028 kB' 'Slab: 873008 kB' 'SReclaimable: 228028 kB' 'SUnreclaim: 644980 kB' 'KernelStack: 21952 kB' 'PageTables: 8064 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10176260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:03:58.225 [setup/common.sh@31-32: per-field scan of the snapshot; every field that is not HugePages_Rsvd takes the continue branch]
00:03:58.226 06:59:16 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:58.226 06:59:16 -- setup/common.sh@33 -- # echo 0
00:03:58.226 06:59:16 -- setup/common.sh@33 -- # return 0
00:03:58.226 06:59:16 -- setup/hugepages.sh@100 -- # resv=0
00:03:58.226 06:59:16 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:58.226 nr_hugepages=1024
00:03:58.226 06:59:16 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:58.226 resv_hugepages=0
00:03:58.226 06:59:16 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:58.226 surplus_hugepages=0
00:03:58.226 06:59:16 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:58.226 anon_hugepages=0
00:03:58.226 06:59:16 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:58.226 06:59:16 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
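The values just gathered feed the assertion at hugepages.sh@107: the kernel's configured hugepage total must equal the requested pages plus surplus and reserved pages. With this run's numbers the check reduces to trivial arithmetic; an illustrative recomputation, not part of the test itself:

    surp=0 resv=0 nr_hugepages=1024
    # hugepages.sh@107 pattern: configured total == requested + surplus + reserved
    (( 1024 == nr_hugepages + surp + resv )) && echo "hugepage accounting is consistent"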
00:03:58.226 06:59:16 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:58.226 06:59:16 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:58.226 06:59:16 -- setup/common.sh@18 -- # local node=
00:03:58.226 06:59:16 -- setup/common.sh@19 -- # local var val
00:03:58.226 06:59:16 -- setup/common.sh@20 -- # local mem_f mem
00:03:58.226 06:59:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:58.226 06:59:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:58.226 06:59:16 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:58.226 06:59:16 -- setup/common.sh@28 -- # mapfile -t mem
00:03:58.226 06:59:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:58.227 06:59:16 -- setup/common.sh@31 -- # IFS=': '
00:03:58.227 06:59:16 -- setup/common.sh@31 -- # read -r var val _
00:03:58.227 06:59:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41682108 kB' 'MemAvailable: 45404604 kB' 'Buffers: 9316 kB' 'Cached: 12422032 kB' 'SwapCached: 0 kB' 'Active: 9316516 kB' 'Inactive: 3688932 kB' 'Active(anon): 8900100 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577408 kB' 'Mapped: 149904 kB' 'Shmem: 8326000 kB' 'KReclaimable: 228028 kB' 'Slab: 873008 kB' 'SReclaimable: 228028 kB' 'SUnreclaim: 644980 kB' 'KernelStack: 21792 kB' 'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10176272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:03:58.227 [setup/common.sh@31-32: per-field scan of the snapshot; every field that is not HugePages_Total takes the continue branch]
00:03:58.228 06:59:16 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:58.228 06:59:16 -- setup/common.sh@33 -- # echo 1024
00:03:58.228 06:59:16 -- setup/common.sh@33 -- # return 0
00:03:58.228 06:59:16 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:58.228 06:59:16 -- setup/hugepages.sh@112 -- # get_nodes
00:03:58.228 06:59:16 -- setup/hugepages.sh@27 -- # local node
00:03:58.228 06:59:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:58.228 06:59:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:58.228 06:59:16 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:58.228 06:59:16 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:58.228 06:59:16 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:58.228 06:59:16 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:58.228 06:59:16 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:58.228 06:59:16 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
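`get_nodes` enumerates the NUMA nodes with an extglob over /sys and records each node's current 2048 kB hugepage count; in this run node0 holds all 1024 pages and node1 none, so `no_nodes=2`. A hedged sketch of the enumeration, assuming the standard per-node sysfs layout (the exact SPDK source lines are paraphrased):

    shopt -s extglob   # needed for the +([0-9]) pattern seen in the trace
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        # ${node##*node} strips everything through the last "node" -> numeric id
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    echo "no_nodes=${#nodes_sys[@]} counts=${nodes_sys[*]}"   # here: no_nodes=2 counts=1024 0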
00:03:58.228 06:59:16 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:58.228 06:59:16 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:58.228 06:59:16 -- setup/common.sh@18 -- # local node=0
00:03:58.228 06:59:16 -- setup/common.sh@19 -- # local var val
00:03:58.228 06:59:16 -- setup/common.sh@20 -- # local mem_f mem
00:03:58.228 06:59:16 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:58.228 06:59:16 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:58.228 06:59:16 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:58.228 06:59:16 -- setup/common.sh@28 -- # mapfile -t mem
00:03:58.228 06:59:16 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:58.228 06:59:16 -- setup/common.sh@31 -- # IFS=': '
00:03:58.228 06:59:16 -- setup/common.sh@31 -- # read -r var val _
00:03:58.228 06:59:16 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18328284 kB' 'MemUsed: 14257084 kB' 'SwapCached: 0 kB' 'Active: 7119036 kB' 'Inactive: 3526132 kB' 'Active(anon): 6899416 kB' 'Inactive(anon): 0 kB' 'Active(file): 219620 kB' 'Inactive(file): 3526132 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10154660 kB' 'Mapped: 126208 kB' 'AnonPages: 493808 kB' 'Shmem: 6408908 kB' 'KernelStack: 12072 kB' 'PageTables: 5728 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 116492 kB' 'Slab: 430196 kB' 'SReclaimable: 116492 kB' 'SUnreclaim: 313704 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:58.228 [setup/common.sh@31-32: per-field scan of the node0 snapshot; every field that is not HugePages_Surp takes the continue branch]
00:03:58.229 06:59:16 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:58.229 06:59:16 -- setup/common.sh@33 -- # echo 0
00:03:58.229 06:59:16 -- setup/common.sh@33 -- # return 0
00:03:58.229 06:59:16 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:58.229 06:59:16 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:58.229 06:59:16 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:58.229 06:59:16 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
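When `get_meminfo` receives a node argument, as in `get_meminfo HugePages_Surp 0` above, it retargets `mem_f` at the per-node meminfo file. Those lines carry a `Node N ` prefix, which is exactly what the `mem=("${mem[@]#Node +([0-9]) }")` expansion strips before the usual scan. An illustrative rendering, assuming node 0 exists:

    shopt -s extglob
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    # per-node lines read "Node 0 HugePages_Surp: 0"; drop the "Node 0 " prefix
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]}" | grep '^HugePages_Surp'   # -> HugePages_Surp: 0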
00:03:58.229 06:59:16 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:58.229 node0=1024 expecting 1024
00:03:58.229 06:59:16 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:58.229
00:03:58.229 real	0m5.466s
00:03:58.229 user	0m1.512s
00:03:58.229 sys	0m2.505s
00:03:58.229 06:59:16 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:58.229 06:59:16 -- common/autotest_common.sh@10 -- # set +x
00:03:58.229 ************************************
00:03:58.229 END TEST default_setup
00:03:58.229 ************************************
00:03:58.229 06:59:16 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:58.229 06:59:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:58.229 06:59:16 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:58.229 06:59:16 -- common/autotest_common.sh@10 -- # set +x
00:03:58.229 ************************************
00:03:58.229 START TEST per_node_1G_alloc
00:03:58.229 ************************************
00:03:58.229 06:59:16 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:03:58.229 06:59:16 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:58.229 06:59:16 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:58.229 06:59:16 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:58.229 06:59:16 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:58.229 06:59:16 -- setup/hugepages.sh@51 -- # shift
00:03:58.229 06:59:16 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:58.229 06:59:16 -- setup/hugepages.sh@52 -- # local node_ids
00:03:58.229 06:59:16 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:58.229 06:59:16 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:58.229 06:59:16 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:58.229 06:59:16 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:58.229 06:59:16 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:58.229 06:59:16 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:58.229 06:59:16 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:58.229 06:59:16 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:58.229 06:59:16 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:58.229 06:59:16 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:58.229 06:59:16 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:58.229 06:59:16 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:58.230 06:59:16 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:58.230 06:59:16 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:58.230 06:59:16 -- setup/hugepages.sh@73 -- # return 0
00:03:58.230 06:59:16 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:58.230 06:59:16 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:58.230 06:59:16 -- setup/hugepages.sh@146 -- # setup output
00:03:58.230 06:59:16 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:58.230 06:59:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
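The test name decodes directly from this trace: `get_test_nr_hugepages 1048576 0 1` requests 1 GiB per node on nodes 0 and 1, and with the 2048 kB `Hugepagesize` shown in the snapshots that is 512 pages per node (`nodes_test[0]=nodes_test[1]=512`), handed to setup.sh as `NRHUGE=512 HUGENODE=0,1`, whose output follows. Recomputing the split (illustrative, not part of the test):

    size_kb=1048576     # 1 GiB requested per node (hugepages.sh@49)
    hugepage_kb=2048    # Hugepagesize from the /proc/meminfo snapshots
    echo $(( size_kb / hugepage_kb ))   # 512 -> nr_hugepages per node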
00:04:02.429 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:02.429 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:02.429 06:59:19 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:04:02.429 06:59:19 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:02.429 06:59:19 -- setup/hugepages.sh@89 -- # local node
00:04:02.429 06:59:19 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:02.429 06:59:19 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:02.429 06:59:19 -- setup/hugepages.sh@92 -- # local surp
00:04:02.429 06:59:19 -- setup/hugepages.sh@93 -- # local resv
00:04:02.429 06:59:19 -- setup/hugepages.sh@94 -- # local anon
00:04:02.429 06:59:19 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
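The `[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]` test just above is `verify_nr_hugepages` checking the kernel's transparent-hugepage switch: the left-hand side is the current contents of the standard sysfs toggle, and because the selected mode is `[madvise]` rather than `[never]`, the test goes on to read AnonHugePages so THP-backed memory can be accounted for. A standalone rendering of that check (illustrative):

    thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
    if [[ $thp != *"[never]"* ]]; then
        echo "THP not disabled ($thp); include AnonHugePages in the accounting"
    fi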
00:04:02.429 06:59:19 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:02.429 06:59:19 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:02.429 06:59:19 -- setup/common.sh@18 -- # local node=
00:04:02.429 06:59:19 -- setup/common.sh@19 -- # local var val
00:04:02.429 06:59:19 -- setup/common.sh@20 -- # local mem_f mem
00:04:02.429 06:59:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.429 06:59:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:02.429 06:59:19 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:02.429 06:59:19 -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.429 06:59:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.429 06:59:19 -- setup/common.sh@31 -- # IFS=': '
00:04:02.429 06:59:19 -- setup/common.sh@31 -- # read -r var val _
00:04:02.429 06:59:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41710704 kB' 'MemAvailable: 45433200 kB' 'Buffers: 9316 kB' 'Cached: 12422116 kB' 'SwapCached: 0 kB' 'Active: 9316412 kB' 'Inactive: 3688932 kB' 'Active(anon): 8899996 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577128 kB' 'Mapped: 148712 kB' 'Shmem: 8326084 kB' 'KReclaimable: 228028 kB' 'Slab: 872764 kB' 'SReclaimable: 228028 kB' 'SUnreclaim: 644736 kB' 'KernelStack: 21808 kB' 'PageTables: 7652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10164860 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:04:02.430 [setup/common.sh@31-32: per-field scan of the snapshot for AnonHugePages; the log excerpt is truncated mid-scan]
setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.430 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.430 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.430 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.430 06:59:19 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.430 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.430 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.430 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.430 06:59:19 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.430 06:59:19 -- setup/common.sh@33 -- # echo 0 00:04:02.430 06:59:19 -- setup/common.sh@33 -- # return 0 00:04:02.430 06:59:19 -- setup/hugepages.sh@97 -- # anon=0 00:04:02.430 06:59:19 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:02.430 06:59:19 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:02.430 06:59:19 -- setup/common.sh@18 -- # local node= 00:04:02.430 06:59:19 -- setup/common.sh@19 -- # local var val 00:04:02.430 06:59:19 -- setup/common.sh@20 -- # local mem_f mem 00:04:02.430 06:59:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.430 06:59:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.430 06:59:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.430 06:59:19 -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.430 06:59:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.430 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.430 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41711232 kB' 'MemAvailable: 45433728 kB' 'Buffers: 9316 kB' 'Cached: 12422120 kB' 'SwapCached: 0 kB' 'Active: 9316132 kB' 'Inactive: 3688932 kB' 'Active(anon): 8899716 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576896 kB' 'Mapped: 148704 kB' 'Shmem: 8326088 kB' 'KReclaimable: 228028 kB' 'Slab: 872828 kB' 'SReclaimable: 228028 kB' 'SUnreclaim: 644800 kB' 'KernelStack: 21824 kB' 'PageTables: 7724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10164872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 
06:59:19 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:19 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:19 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.431 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.431 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.432 06:59:20 -- setup/common.sh@33 -- # echo 0 00:04:02.432 06:59:20 -- setup/common.sh@33 -- # return 0 00:04:02.432 06:59:20 -- setup/hugepages.sh@99 -- # surp=0 00:04:02.432 06:59:20 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:02.432 06:59:20 -- setup/common.sh@17 -- # local 
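In brief, each get_meminfo call traced above loads /proc/meminfo (or a per-node meminfo file), strips any "Node N " prefix, and walks the key/value pairs until the requested field matches, printing 0 when the field is absent. A minimal standalone sketch of that lookup, assuming GNU bash 4+ on Linux; the name get_meminfo_value is illustrative, not the SPDK helper itself:

  #!/usr/bin/env bash
  shopt -s extglob   # needed for the "Node N " prefix strip below

  get_meminfo_value() {   # illustrative name, not the script's own
      local get=$1 node=${2-} var val _
      local mem_f=/proc/meminfo
      # Per-node counters live in sysfs; their lines carry a "Node N " prefix.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local -a mem
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")   # drop the per-node prefix, if any
      while IFS=': ' read -r var val _; do
          if [[ $var == "$get" ]]; then
              echo "${val:-0}"           # e.g. 1024 for HugePages_Total here
              return 0
          fi
      done < <(printf '%s\n' "${mem[@]}")
      echo 0                             # field not present: report zero
  }

  get_meminfo_value AnonHugePages      # prints 0 on this rig
  get_meminfo_value HugePages_Surp 0   # NUMA node 0, also 0 here

The field-by-field scan is also why the xtrace is so chatty: every non-matching key costs one @32 test and one continue.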
00:04:02.432 06:59:20 -- setup/common.sh@18 -- # local node=
00:04:02.432 06:59:20 -- setup/common.sh@19 -- # local var val
00:04:02.432 06:59:20 -- setup/common.sh@20 -- # local mem_f mem
00:04:02.432 06:59:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:02.432 06:59:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:02.432 06:59:20 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:02.432 06:59:20 -- setup/common.sh@28 -- # mapfile -t mem
00:04:02.432 06:59:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:02.432 06:59:20 -- setup/common.sh@31 -- # IFS=': '
00:04:02.432 06:59:20 -- setup/common.sh@31 -- # read -r var val _
00:04:02.432 06:59:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41711232 kB' 'MemAvailable: 45433728 kB' 'Buffers: 9316 kB' 'Cached: 12422120 kB' 'SwapCached: 0 kB' 'Active: 9316176 kB' 'Inactive: 3688932 kB' 'Active(anon): 8899760 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576928 kB' 'Mapped: 148704 kB' 'Shmem: 8326088 kB' 'KReclaimable: 228028 kB' 'Slab: 872828 kB' 'SReclaimable: 228028 kB' 'SUnreclaim: 644800 kB' 'KernelStack: 21840 kB' 'PageTables: 7780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10164884 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:04:02.432 06:59:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:02.432 06:59:20 -- setup/common.sh@32 -- # continue
[ the scan repeats for every field, MemTotal through HugePages_Free, until HugePages_Rsvd is reached ]
00:04:02.433 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:02.433 06:59:20 -- setup/common.sh@33 -- # echo 0
00:04:02.433 06:59:20 -- setup/common.sh@33 -- # return 0
00:04:02.433 06:59:20 -- setup/hugepages.sh@100 -- # resv=0
00:04:02.433 06:59:20 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:02.433 nr_hugepages=1024
00:04:02.433 06:59:20 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:02.433 resv_hugepages=0
00:04:02.433 06:59:20 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:02.433 surplus_hugepages=0
00:04:02.433 06:59:20 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:02.433 anon_hugepages=0
00:04:02.433 06:59:20 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:02.433 06:59:20 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
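At this point the script holds anon=0, surp=0, resv=0 and nr_hugepages=1024, and the checks at hugepages.sh@107-@110 assert the pool's accounting identity: the pages the kernel reports must equal the requested pool plus any surplus or reserved pages. A hedged, self-contained restatement of that check, reading the counters with awk rather than the traced helper:

  #!/usr/bin/env bash
  # Sketch of the accounting identity verified here (1024 == 1024 + 0 + 0 in this run).
  nr_hugepages=1024
  total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)   # 1024 here
  surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)     # 0 here
  resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)     # 0 here
  # A surplus page was allocated beyond the configured pool; a reserved page
  # is promised to a mapping but not yet faulted in. Both must be zero for
  # the pool to match exactly.
  (( total == nr_hugepages + surp + resv )) \
      || { echo "hugepage accounting mismatch" >&2; exit 1; }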
IFS=': ' 00:04:02.433 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41714736 kB' 'MemAvailable: 45437232 kB' 'Buffers: 9316 kB' 'Cached: 12422148 kB' 'SwapCached: 0 kB' 'Active: 9316348 kB' 'Inactive: 3688932 kB' 'Active(anon): 8899932 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577288 kB' 'Mapped: 148704 kB' 'Shmem: 8326116 kB' 'KReclaimable: 228028 kB' 'Slab: 872828 kB' 'SReclaimable: 228028 kB' 'SUnreclaim: 644800 kB' 'KernelStack: 21824 kB' 'PageTables: 7724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10164900 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 
06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.434 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.434 06:59:20 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ CmaTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.435 06:59:20 -- setup/common.sh@33 -- # echo 1024 00:04:02.435 06:59:20 -- setup/common.sh@33 -- # return 0 00:04:02.435 06:59:20 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:02.435 06:59:20 -- setup/hugepages.sh@112 -- # get_nodes 00:04:02.435 06:59:20 -- setup/hugepages.sh@27 -- # local node 00:04:02.435 06:59:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:02.435 06:59:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:02.435 06:59:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:02.435 06:59:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:02.435 06:59:20 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:02.435 06:59:20 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:02.435 06:59:20 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:02.435 06:59:20 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:02.435 06:59:20 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:02.435 06:59:20 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:02.435 06:59:20 -- setup/common.sh@18 -- # local node=0 00:04:02.435 06:59:20 -- setup/common.sh@19 -- # local var val 00:04:02.435 06:59:20 -- setup/common.sh@20 -- # local mem_f mem 00:04:02.435 06:59:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.435 06:59:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:02.435 06:59:20 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:02.435 06:59:20 -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.435 06:59:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19388292 kB' 'MemUsed: 13197076 kB' 'SwapCached: 0 kB' 'Active: 7119440 kB' 'Inactive: 3526132 kB' 'Active(anon): 6899820 kB' 'Inactive(anon): 0 kB' 'Active(file): 219620 kB' 'Inactive(file): 3526132 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10154720 kB' 'Mapped: 125296 kB' 'AnonPages: 494244 kB' 'Shmem: 6408968 kB' 'KernelStack: 12168 kB' 'PageTables: 6048 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 116492 kB' 'Slab: 430164 kB' 'SReclaimable: 116492 kB' 'SUnreclaim: 313672 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 
0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
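A note on the backslash-escaped strings in this stretch of the trace: sequences like \H\u\g\e\P\a\g\e\s\_\S\u\r\p are not log corruption. With xtrace (set -x) enabled, bash prints the right-hand side of an unquoted [[ $var == pattern ]] comparison with every character escaped. Each entry here is therefore one meminfo field name being glob-matched against the literal key HugePages_Surp; every non-matching field falls through to continue, and only the matching field's value gets echoed.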
00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.435 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.435 06:59:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 
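For readability, here is a compact reconstruction of the get_meminfo scan that produces these entries. This is a sketch inferred from the xtrace output, not the verbatim source of setup/common.sh; the prefix-strip line mirrors the mem=("${mem[@]#Node +([0-9]) }") step visible in the trace, which removes the "Node <n> " prefix that per-node meminfo files carry on every line (it needs extglob):

  shopt -s extglob                                 # for the +([0-9]) pattern

  get_meminfo() {                                  # usage: get_meminfo <Field> [node]
      local get=$1 node=${2:-} mem_f=/proc/meminfo line var val _
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      while IFS= read -r line; do
          line=${line#Node +([0-9]) }              # strip per-node line prefix
          IFS=': ' read -r var val _ <<<"$line"    # "HugePages_Surp: 0" -> var, val
          if [[ $var == "$get" ]]; then            # requested field found
              echo "$val"
              return 0
          fi
      done <"$mem_f"
      return 1
  }

Called as get_meminfo HugePages_Surp 0, this walks /sys/devices/system/node/node0/meminfo field by field and prints 0, which is exactly the echo 0 / return 0 pair that terminates each scan in the trace.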
00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@33 -- # echo 0 00:04:02.436 06:59:20 -- setup/common.sh@33 -- # return 0 00:04:02.436 06:59:20 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:02.436 06:59:20 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:02.436 06:59:20 -- setup/hugepages.sh@116 -- # (( 
nodes_test[node] += resv )) 00:04:02.436 06:59:20 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:02.436 06:59:20 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:02.436 06:59:20 -- setup/common.sh@18 -- # local node=1 00:04:02.436 06:59:20 -- setup/common.sh@19 -- # local var val 00:04:02.436 06:59:20 -- setup/common.sh@20 -- # local mem_f mem 00:04:02.436 06:59:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.436 06:59:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:02.436 06:59:20 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:02.436 06:59:20 -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.436 06:59:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698428 kB' 'MemFree: 22327348 kB' 'MemUsed: 5371080 kB' 'SwapCached: 0 kB' 'Active: 2198036 kB' 'Inactive: 162800 kB' 'Active(anon): 2001240 kB' 'Inactive(anon): 0 kB' 'Active(file): 196796 kB' 'Inactive(file): 162800 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2276744 kB' 'Mapped: 23408 kB' 'AnonPages: 84504 kB' 'Shmem: 1917148 kB' 'KernelStack: 9656 kB' 'PageTables: 1676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111600 kB' 'Slab: 442728 kB' 'SReclaimable: 111600 kB' 'SUnreclaim: 331128 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 
06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.436 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.436 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 
06:59:20 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # continue 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.437 06:59:20 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.437 06:59:20 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.437 06:59:20 -- setup/common.sh@33 -- # echo 0 00:04:02.437 06:59:20 -- setup/common.sh@33 -- # return 0 00:04:02.437 06:59:20 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:02.437 06:59:20 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:02.437 06:59:20 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:02.437 06:59:20 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:02.437 06:59:20 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:02.437 node0=512 expecting 512 00:04:02.437 06:59:20 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:02.437 06:59:20 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:02.437 06:59:20 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:02.437 06:59:20 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:02.437 node1=512 expecting 512 00:04:02.437 06:59:20 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:02.437 00:04:02.437 real 0m3.751s 00:04:02.437 user 0m1.425s 00:04:02.437 sys 0m2.396s 00:04:02.437 06:59:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:02.437 06:59:20 -- common/autotest_common.sh@10 -- # set +x 00:04:02.437 ************************************ 00:04:02.437 END TEST per_node_1G_alloc 00:04:02.437 ************************************ 00:04:02.437 06:59:20 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:02.437 06:59:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:02.437 06:59:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:02.437 06:59:20 -- common/autotest_common.sh@10 -- # set +x 00:04:02.437 ************************************ 00:04:02.437 START TEST even_2G_alloc 00:04:02.437 ************************************ 00:04:02.437 06:59:20 -- common/autotest_common.sh@1114 -- # even_2G_alloc 00:04:02.437 06:59:20 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:02.437 06:59:20 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:02.437 06:59:20 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:02.437 06:59:20 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 
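even_2G_alloc starts from a 2 GiB request. The sizing arithmetic behind the nr_hugepages=1024 and per-node 512 assignments traced below is simple; this sketch restates it with hypothetical variable names where the trace does not show them, assuming both sizes are in kB, which matches the 'Hugepagesize: 2048 kB' and 'Hugetlb: 2097152 kB' values in the meminfo dumps:

  size=2097152                                   # requested pool in kB (2 GiB)
  default_hugepages=2048                         # Hugepagesize from /proc/meminfo, kB
  nr_hugepages=$(( size / default_hugepages ))   # 2097152 / 2048 = 1024 pages
  no_nodes=2                                     # NUMA nodes on this rig
  per_node=$(( nr_hugepages / no_nodes ))        # 512 pages on each node
  echo "NRHUGE=$nr_hugepages node0=$per_node node1=$per_node"

With HUGE_EVEN_ALLOC=yes, setup.sh is then expected to spread the 1024 pages evenly across both nodes, which the verify_nr_hugepages pass that follows is set up to check.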
00:04:02.437 06:59:20 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:02.437 06:59:20 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:02.437 06:59:20 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:02.437 06:59:20 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:02.437 06:59:20 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:02.437 06:59:20 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:02.437 06:59:20 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:02.437 06:59:20 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:02.437 06:59:20 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:02.437 06:59:20 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:02.437 06:59:20 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:02.437 06:59:20 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:02.437 06:59:20 -- setup/hugepages.sh@83 -- # : 512 00:04:02.437 06:59:20 -- setup/hugepages.sh@84 -- # : 1 00:04:02.437 06:59:20 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:02.437 06:59:20 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:02.437 06:59:20 -- setup/hugepages.sh@83 -- # : 0 00:04:02.437 06:59:20 -- setup/hugepages.sh@84 -- # : 0 00:04:02.437 06:59:20 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:02.437 06:59:20 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:02.437 06:59:20 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:02.437 06:59:20 -- setup/hugepages.sh@153 -- # setup output 00:04:02.437 06:59:20 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:02.437 06:59:20 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:05.733 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:05.733 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:05.733 06:59:23 -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:05.733 06:59:23 -- setup/hugepages.sh@89 -- # local node 00:04:05.733 06:59:23 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:05.733 06:59:23 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:05.733 06:59:23 -- setup/hugepages.sh@92 -- # local surp 00:04:05.733 06:59:23 -- setup/hugepages.sh@93 -- # local resv 00:04:05.733 06:59:23 -- setup/hugepages.sh@94 -- # local anon 00:04:05.733 06:59:23 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:05.733 06:59:23 -- setup/hugepages.sh@97 -- # 
get_meminfo AnonHugePages 00:04:05.733 06:59:23 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:05.733 06:59:23 -- setup/common.sh@18 -- # local node= 00:04:05.733 06:59:23 -- setup/common.sh@19 -- # local var val 00:04:05.733 06:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.733 06:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.733 06:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.733 06:59:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.733 06:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.733 06:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41726312 kB' 'MemAvailable: 45448840 kB' 'Buffers: 9316 kB' 'Cached: 12422264 kB' 'SwapCached: 0 kB' 'Active: 9315492 kB' 'Inactive: 3688932 kB' 'Active(anon): 8899076 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576132 kB' 'Mapped: 148728 kB' 'Shmem: 8326232 kB' 'KReclaimable: 228092 kB' 'Slab: 873132 kB' 'SReclaimable: 228092 kB' 'SUnreclaim: 645040 kB' 'KernelStack: 21680 kB' 'PageTables: 7308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10165284 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ 
SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.733 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.733 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:05.734 06:59:23 -- setup/common.sh@33 -- # echo 0 00:04:05.734 06:59:23 -- setup/common.sh@33 -- # return 0 00:04:05.734 06:59:23 -- setup/hugepages.sh@97 -- # anon=0 00:04:05.734 06:59:23 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:05.734 06:59:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.734 06:59:23 -- setup/common.sh@18 -- # local node= 00:04:05.734 06:59:23 -- setup/common.sh@19 -- # local var val 00:04:05.734 06:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.734 06:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.734 06:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.734 06:59:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.734 06:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.734 06:59:23 -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41726628 kB' 'MemAvailable: 45449156 kB' 'Buffers: 9316 kB' 'Cached: 12422264 kB' 'SwapCached: 0 kB' 'Active: 9315700 kB' 'Inactive: 3688932 kB' 'Active(anon): 8899284 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576408 kB' 'Mapped: 148708 kB' 'Shmem: 8326232 kB' 'KReclaimable: 228092 kB' 'Slab: 873328 kB' 'SReclaimable: 228092 kB' 'SUnreclaim: 645236 kB' 'KernelStack: 21712 kB' 'PageTables: 7460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10165664 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214224 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 
06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.734 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.734 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 
06:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ CmaTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.735 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.735 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.735 06:59:23 -- setup/common.sh@33 -- # echo 0 00:04:05.735 06:59:23 -- setup/common.sh@33 -- # return 0 00:04:05.735 06:59:23 -- setup/hugepages.sh@99 -- # surp=0 00:04:05.735 06:59:23 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:05.735 06:59:23 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:05.735 06:59:23 -- setup/common.sh@18 -- # local node= 00:04:05.735 06:59:23 -- setup/common.sh@19 -- # local var val 00:04:05.735 06:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.735 06:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.735 06:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.735 06:59:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.735 06:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.735 06:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.736 06:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41727892 kB' 'MemAvailable: 45450420 kB' 'Buffers: 9316 kB' 'Cached: 12422276 kB' 'SwapCached: 0 kB' 'Active: 9316652 kB' 'Inactive: 3688932 kB' 'Active(anon): 8900236 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 576844 kB' 'Mapped: 149212 kB' 'Shmem: 8326244 kB' 'KReclaimable: 228092 kB' 'Slab: 873328 kB' 'SReclaimable: 228092 kB' 'SUnreclaim: 645236 kB' 'KernelStack: 21680 kB' 'PageTables: 7380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10167296 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214192 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 
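
The loop traced above is setup/common.sh's get_meminfo: it prints the meminfo snapshot once, then re-reads it with IFS=': ' so each line splits into a key (var) and a value (val), hitting `continue` on every key until the requested one matches, at which point the value is echoed back to the caller. A minimal standalone sketch of the same lookup pattern follows; the function name and error handling here are illustrative, not the actual setup/common.sh code:

    # lookup_meminfo: hypothetical stand-in for the traced get_meminfo loop.
    # IFS=': ' splits a line like "HugePages_Surp:    0" into
    # var=HugePages_Surp and val=0; the trailing _ soaks up a "kB" unit.
    lookup_meminfo() {
        local get=$1 file=${2:-/proc/meminfo} var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < "$file"
        return 1   # key not present in this kernel's meminfo
    }

    lookup_meminfo HugePages_Surp   # prints 0 on this host, matching the trace
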
00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- 
setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 
06:59:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.736 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.736 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 
06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:05.737 06:59:23 -- setup/common.sh@33 -- # echo 0 00:04:05.737 06:59:23 -- setup/common.sh@33 -- # return 0 00:04:05.737 06:59:23 -- setup/hugepages.sh@100 -- # resv=0 00:04:05.737 06:59:23 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:05.737 nr_hugepages=1024 00:04:05.737 06:59:23 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:05.737 resv_hugepages=0 00:04:05.737 06:59:23 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:05.737 surplus_hugepages=0 00:04:05.737 06:59:23 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:05.737 anon_hugepages=0 00:04:05.737 06:59:23 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:05.737 06:59:23 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:05.737 06:59:23 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:05.737 06:59:23 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:05.737 06:59:23 -- setup/common.sh@18 -- # local node= 00:04:05.737 06:59:23 -- setup/common.sh@19 -- # local var val 00:04:05.737 06:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.737 06:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.737 06:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.737 06:59:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.737 06:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.737 06:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41724912 kB' 'MemAvailable: 45447440 kB' 'Buffers: 9316 kB' 'Cached: 12422292 kB' 'SwapCached: 0 kB' 'Active: 9321692 kB' 'Inactive: 3688932 kB' 'Active(anon): 8905276 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 582420 kB' 'Mapped: 149624 kB' 'Shmem: 8326260 kB' 'KReclaimable: 228092 kB' 'Slab: 873332 kB' 'SReclaimable: 228092 kB' 'SUnreclaim: 645240 kB' 'KernelStack: 21728 kB' 'PageTables: 7524 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10175216 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214196 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
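
Around hugepages.sh@107-110 the harness asserts that the kernel's view is self-consistent: the requested count (nr_hugepages=1024) plus surplus and reserved pages must equal HugePages_Total before the per-node checks proceed. A sketch of that accounting check, reusing the hypothetical lookup_meminfo helper from the earlier sketch:

    # Corresponds to the traced (( 1024 == nr_hugepages + surp + resv )).
    nr_hugepages=1024
    surp=$(lookup_meminfo HugePages_Surp)    # 0 in this run
    resv=$(lookup_meminfo HugePages_Rsvd)    # 0 in this run
    total=$(lookup_meminfo HugePages_Total)  # 1024 in this run
    (( total == nr_hugepages + surp + resv )) && echo "hugepage accounting OK: $total pages"
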
00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.737 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.737 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 
-- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 
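
A side note on the odd-looking \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l runs in this trace: when xtrace is on, bash backslash-escapes the right-hand side of a [[ == ]] comparison that was quoted, to show it is matched literally rather than expanded as a glob. The source@line prefix on each entry appears to come from the harness's customized PS4. A two-line reproduction (illustrative, not taken from the test scripts):

    set -x
    var=HugePages_Total
    [[ $var == "HugePages_Total" ]]   # traced as: [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
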
00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 
-- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.738 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.738 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:05.738 06:59:23 -- setup/common.sh@33 -- # echo 1024 00:04:05.738 06:59:23 -- setup/common.sh@33 -- # return 0 00:04:05.738 06:59:23 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:05.738 06:59:23 -- setup/hugepages.sh@112 -- # get_nodes 00:04:05.738 06:59:23 -- setup/hugepages.sh@27 -- # local node 00:04:05.738 06:59:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:05.738 06:59:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:05.738 06:59:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 
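
The get_nodes helper entered here (hugepages.sh@27-33) discovers the NUMA topology by extglob-matching the node directories under /sys, then records how many hugepages each node is expected to hold; in this run 1024 pages split evenly, 512 per node, across no_nodes=2. A minimal sketch of the same enumeration, under the assumption of that even split:

    # Enumerate NUMA nodes the way the traced get_nodes loop does.
    shopt -s extglob
    nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        nodes_sys[${node##*node}]=512   # expected pages per node (1024 / 2)
    done
    echo "no_nodes=${#nodes_sys[@]}"    # 2 on this machine
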
00:04:05.738 06:59:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:05.738 06:59:23 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:05.738 06:59:23 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:05.738 06:59:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:05.738 06:59:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:05.738 06:59:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:05.738 06:59:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.738 06:59:23 -- setup/common.sh@18 -- # local node=0 00:04:05.738 06:59:23 -- setup/common.sh@19 -- # local var val 00:04:05.738 06:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.739 06:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.739 06:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:05.739 06:59:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:05.739 06:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.739 06:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19392352 kB' 'MemUsed: 13193016 kB' 'SwapCached: 0 kB' 'Active: 7121416 kB' 'Inactive: 3526132 kB' 'Active(anon): 6901796 kB' 'Inactive(anon): 0 kB' 'Active(file): 219620 kB' 'Inactive(file): 3526132 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10154864 kB' 'Mapped: 125304 kB' 'AnonPages: 496000 kB' 'Shmem: 6409112 kB' 'KernelStack: 12040 kB' 'PageTables: 5744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 116492 kB' 'Slab: 430608 kB' 'SReclaimable: 116492 kB' 'SUnreclaim: 314116 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- 
setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 
00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
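
When get_meminfo is called with a node argument (node=0 above), mem_f switches from /proc/meminfo to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix; the mem=("${mem[@]#Node +([0-9]) }") step strips that prefix so the same IFS=': ' parse applies. A standalone sketch of the per-node path, assuming node0 exists on the host:

    # Per-node variant: strip the "Node <n> " prefix, then parse as before.
    shopt -s extglob
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
    printf '%s\n' "${mem[@]}" | while IFS=': ' read -r var val _; do
        [[ $var == HugePages_Surp ]] && echo "node0 HugePages_Surp: $val"
    done
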
00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.739 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.739 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.739 06:59:23 -- setup/common.sh@33 -- # echo 0 00:04:05.739 06:59:23 -- setup/common.sh@33 -- # return 0 00:04:05.739 06:59:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:05.739 06:59:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:05.739 06:59:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:05.740 06:59:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:05.740 06:59:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:05.740 06:59:23 -- setup/common.sh@18 -- # local node=1 00:04:05.740 06:59:23 -- setup/common.sh@19 -- # local var val 00:04:05.740 06:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.740 06:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.740 06:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:05.740 06:59:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:05.740 06:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.740 06:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.740 06:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698428 kB' 'MemFree: 22336296 kB' 'MemUsed: 5362132 kB' 'SwapCached: 0 kB' 'Active: 2196416 kB' 'Inactive: 162800 kB' 'Active(anon): 1999620 kB' 'Inactive(anon): 0 kB' 'Active(file): 196796 kB' 'Inactive(file): 162800 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2276760 kB' 'Mapped: 24236 kB' 'AnonPages: 82600 kB' 'Shmem: 1917164 kB' 'KernelStack: 9720 kB' 'PageTables: 1556 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111600 kB' 'Slab: 442716 kB' 'SReclaimable: 111600 kB' 'SUnreclaim: 331116 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 
kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 
06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # continue 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.740 06:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.740 06:59:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.741 06:59:23 -- setup/common.sh@33 -- # echo 0 00:04:05.741 06:59:23 -- setup/common.sh@33 -- # return 0 00:04:05.741 06:59:23 -- setup/hugepages.sh@117 -- # (( 
nodes_test[node] += 0 ))
00:04:05.741 06:59:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:05.741 06:59:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:05.741 06:59:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:05.741 06:59:23 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:05.741 node0=512 expecting 512
00:04:05.741 06:59:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:05.741 06:59:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:05.741 06:59:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:05.741 06:59:23 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:05.741 node1=512 expecting 512
00:04:05.741 06:59:23 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:05.741
00:04:05.741 real 0m3.769s
00:04:05.741 user 0m1.474s
00:04:05.741 sys 0m2.370s
00:04:05.741 06:59:23 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:05.741 06:59:23 -- common/autotest_common.sh@10 -- # set +x
00:04:05.741 ************************************
00:04:05.741 END TEST even_2G_alloc
00:04:05.741 ************************************
00:04:06.000 06:59:23 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:06.000 06:59:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:06.000 06:59:23 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:06.000 06:59:23 -- common/autotest_common.sh@10 -- # set +x
00:04:06.000 ************************************
00:04:06.000 START TEST odd_alloc
00:04:06.000 ************************************
00:04:06.000 06:59:23 -- common/autotest_common.sh@1114 -- # odd_alloc
00:04:06.000 06:59:23 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:04:06.000 06:59:23 -- setup/hugepages.sh@49 -- # local size=2098176
00:04:06.000 06:59:24 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:06.000 06:59:24 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:06.000 06:59:24 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:04:06.000 06:59:24 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:06.000 06:59:24 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:06.000 06:59:24 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:06.000 06:59:24 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:04:06.000 06:59:24 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:06.000 06:59:24 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:06.000 06:59:24 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:06.000 06:59:24 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:06.000 06:59:24 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:06.000 06:59:24 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:06.000 06:59:24 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:06.000 06:59:24 -- setup/hugepages.sh@83 -- # : 513
00:04:06.000 06:59:24 -- setup/hugepages.sh@84 -- # : 1
00:04:06.000 06:59:24 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:06.000 06:59:24 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:04:06.000 06:59:24 -- setup/hugepages.sh@83 -- # : 0
00:04:06.000 06:59:24 -- setup/hugepages.sh@84 -- # : 0
00:04:06.001 06:59:24 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:06.001 06:59:24 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:04:06.001 06:59:24 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:04:06.001 06:59:24 -- setup/hugepages.sh@160 -- # setup output
00:04:06.001 06:59:24 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:06.001 06:59:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:09.293 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:09.293 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:09.559 06:59:27 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:09.559 06:59:27 -- setup/hugepages.sh@89 -- # local node
00:04:09.559 06:59:27 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:09.559 06:59:27 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:09.559 06:59:27 -- setup/hugepages.sh@92 -- # local surp
00:04:09.559 06:59:27 -- setup/hugepages.sh@93 -- # local resv
00:04:09.559 06:59:27 -- setup/hugepages.sh@94 -- # local anon
00:04:09.559 06:59:27 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:09.559 06:59:27 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:09.559 06:59:27 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:09.559 06:59:27 -- setup/common.sh@18 -- # local node=
00:04:09.559 06:59:27 -- setup/common.sh@19 -- # local var val
00:04:09.559 06:59:27 -- setup/common.sh@20 -- # local mem_f mem
00:04:09.559 06:59:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:09.559 06:59:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:09.559 06:59:27 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:09.559 06:59:27 -- setup/common.sh@28 -- # mapfile -t mem
00:04:09.559 06:59:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': '
00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _
00:04:09.559 06:59:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41742416 kB' 'MemAvailable: 45464944 kB' 'Buffers: 9316 kB' 'Cached: 12422396 kB' 'SwapCached: 0 kB' 'Active: 9317548 kB' 'Inactive: 3688932 kB' 'Active(anon): 8901132 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577668 kB' 'Mapped: 148720 kB' 'Shmem: 8326364 kB' 'KReclaimable: 228092 kB' 'Slab: 873560 kB' 'SReclaimable: 228092 kB' 'SUnreclaim: 645468 kB' 'KernelStack: 21696 kB' 'PageTables: 7300 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0
kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10166312 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.559 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.559 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- 
setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 
-- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.560 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.560 06:59:27 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:09.560 06:59:27 -- setup/common.sh@33 -- # echo 0 00:04:09.560 06:59:27 -- setup/common.sh@33 -- # return 0 00:04:09.560 06:59:27 -- setup/hugepages.sh@97 -- # anon=0 00:04:09.560 06:59:27 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:09.560 06:59:27 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:09.560 06:59:27 -- setup/common.sh@18 -- # local node= 00:04:09.560 06:59:27 -- setup/common.sh@19 -- # local var val 00:04:09.560 06:59:27 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.560 06:59:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.560 06:59:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.560 06:59:27 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.560 06:59:27 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.561 06:59:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41742964 kB' 'MemAvailable: 45465492 kB' 'Buffers: 9316 kB' 'Cached: 12422400 kB' 'SwapCached: 0 kB' 'Active: 9317000 kB' 'Inactive: 3688932 kB' 'Active(anon): 8900584 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577580 kB' 'Mapped: 148716 kB' 'Shmem: 8326368 kB' 'KReclaimable: 228092 kB' 'Slab: 873588 kB' 'SReclaimable: 228092 kB' 'SUnreclaim: 645496 kB' 'KernelStack: 21712 kB' 'PageTables: 7456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10166324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:09.561 06:59:27 -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # 
IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.561 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.561 06:59:27 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- 
setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 
06:59:27 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # continue 
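
The scan in progress here, like the AnonHugePages scan before it and the HugePages_Rsvd and HugePages_Total scans after, is setup/common.sh's get_meminfo walking a meminfo file one field at a time: `IFS=': '` plus `read -r var val _` splits each `Name: value kB` line, every non-matching name costs one `[[ ... ]]`/`continue` pair in the xtrace, and the first match echoes its value and returns. A condensed sketch of that loop, reconstructed from the trace rather than copied from the script (the node handling is inferred from the `[[ -e /sys/devices/system/node/node/meminfo ]]` probes that appear when no node argument is passed):

    # Sketch of the traced loop: fetch one field from a meminfo-style file,
    # switching to the per-node sysfs copy when a node number is supplied.
    get_meminfo() {
        local get=$1 node=${2:-} var val _
        local mem_f=/proc/meminfo
        # With $node empty this tests .../node/node/meminfo, which never
        # exists -- exactly the false [[ -e ... ]] probe repeated in the trace.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # one compare + continue per field
            echo "$val" && return 0
        done <"$mem_f"
    }

    get_meminfo HugePages_Surp    # on this box prints 0, the "echo 0" seen next
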
00:04:09.562 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.562 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.562 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:09.563 06:59:27 -- setup/common.sh@33 -- # echo 0 00:04:09.563 06:59:27 -- setup/common.sh@33 -- # return 0 00:04:09.563 06:59:27 -- setup/hugepages.sh@99 -- # surp=0 00:04:09.563 06:59:27 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:09.563 06:59:27 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:09.563 06:59:27 -- setup/common.sh@18 -- # local node= 00:04:09.563 06:59:27 -- setup/common.sh@19 -- # local var val 00:04:09.563 06:59:27 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.563 06:59:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.563 06:59:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.563 06:59:27 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.563 06:59:27 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.563 06:59:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41742964 kB' 'MemAvailable: 45465492 kB' 'Buffers: 9316 kB' 'Cached: 12422424 kB' 'SwapCached: 0 kB' 'Active: 9316964 kB' 'Inactive: 3688932 kB' 'Active(anon): 8900548 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577504 kB' 'Mapped: 148716 kB' 'Shmem: 8326392 kB' 'KReclaimable: 228092 kB' 'Slab: 873588 kB' 'SReclaimable: 228092 kB' 'SUnreclaim: 645496 kB' 'KernelStack: 21696 kB' 'PageTables: 7400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10166340 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # 
continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ 
Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.563 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.563 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 
-- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- 
setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.564 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.564 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:09.565 06:59:27 -- setup/common.sh@33 -- # echo 0 00:04:09.565 06:59:27 -- setup/common.sh@33 -- # return 0 00:04:09.565 06:59:27 -- setup/hugepages.sh@100 -- # resv=0 00:04:09.565 06:59:27 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:09.565 nr_hugepages=1025 00:04:09.565 06:59:27 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:09.565 resv_hugepages=0 00:04:09.565 06:59:27 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:09.565 surplus_hugepages=0 00:04:09.565 06:59:27 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:09.565 anon_hugepages=0 00:04:09.565 06:59:27 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:09.565 06:59:27 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:04:09.565 06:59:27 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:09.565 06:59:27 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:09.565 06:59:27 -- setup/common.sh@18 -- # local node= 00:04:09.565 06:59:27 -- 
setup/common.sh@19 -- # local var val 00:04:09.565 06:59:27 -- setup/common.sh@20 -- # local mem_f mem 00:04:09.565 06:59:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:09.565 06:59:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:09.565 06:59:27 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:09.565 06:59:27 -- setup/common.sh@28 -- # mapfile -t mem 00:04:09.565 06:59:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41743576 kB' 'MemAvailable: 45466104 kB' 'Buffers: 9316 kB' 'Cached: 12422424 kB' 'SwapCached: 0 kB' 'Active: 9317036 kB' 'Inactive: 3688932 kB' 'Active(anon): 8900620 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577576 kB' 'Mapped: 148716 kB' 'Shmem: 8326392 kB' 'KReclaimable: 228092 kB' 'Slab: 873588 kB' 'SReclaimable: 228092 kB' 'SUnreclaim: 645496 kB' 'KernelStack: 21712 kB' 'PageTables: 7456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10166352 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 
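
By this point the scans have established anon=0, surp=0 and resv=0, the test has printed `nr_hugepages=1025 resv_hugepages=0 surplus_hugepages=0 anon_hugepages=0`, and the `(( 1025 == nr_hugepages + surp + resv ))` guard has passed; the scan now in progress fetches HugePages_Total to confirm the kernel actually materialized all 1025 pages. A rough standalone equivalent of that bookkeeping, with awk standing in for the get_meminfo loop (variable names here are ours, not the script's):

    # Same consistency check, outside the test harness: the requested count must
    # equal HugePages_Total, with no surplus or reserved pages outstanding.
    expected=1025
    total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
    surp=$(awk  '$1 == "HugePages_Surp:"  {print $2}' /proc/meminfo)
    resv=$(awk  '$1 == "HugePages_Rsvd:"  {print $2}' /proc/meminfo)
    (( expected == total && surp == 0 && resv == 0 )) ||
        echo "hugepage accounting mismatch: total=$total surp=$surp resv=$resv" >&2
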
00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.565 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.565 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # continue 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # IFS=': ' 00:04:09.566 06:59:27 -- setup/common.sh@31 -- # read -r var val _ 00:04:09.566 06:59:27 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
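
Worth pausing on where the 1025 came from: odd_alloc asked get_test_nr_hugepages for 2098176 kB, i.e. 2049 MiB, which at the 2048 kB hugepage size rounds up to an odd 1025 pages, and get_test_nr_hugepages_per_node split that as node1=512, node0=513. The `: 513`/`: 1` and `: 0`/`: 0` no-op expansions in that earlier trace fit a loop that walks nodes from the highest index down, giving each floor(remaining/nodes_left) and carrying the remainder toward node 0; a sketch of that reading (a reconstruction, not the verbatim script):

    # Distribute an odd hugepage count across NUMA nodes, highest index first,
    # so the remainder lands on node 0: 1025 over 2 nodes -> 512 then 513.
    _nr_hugepages=1025
    _no_nodes=2
    declare -a nodes_test
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))   # xtrace: ": 513", ": 0"
        : $(( --_no_nodes ))                                  # xtrace: ": 1",   ": 0"
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"      # node0=513 node1=512

The kernel's even allocator happens to land them the other way round (get_nodes below reads nodes_sys[0]=512, nodes_sys[1]=513 from sysfs), which is presumably why the verification builds the sorted_t/sorted_s sets seen earlier rather than comparing nodes positionally.
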
[setup/common.sh@31-32: IFS=': ' / read -r var val _ / [[ $var == HugePages_Total ]] / continue repeats for each remaining meminfo field, Writeback through Unaccepted]
00:04:09.567 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:09.567 06:59:27 -- setup/common.sh@33 -- # echo 1025
00:04:09.567 06:59:27 -- setup/common.sh@33 -- # return 0
00:04:09.567 06:59:27 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:09.567 06:59:27 -- setup/hugepages.sh@112 -- # get_nodes
00:04:09.567 06:59:27 -- setup/hugepages.sh@27 -- # local node
00:04:09.567 06:59:27 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:09.567 06:59:27 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:09.567 06:59:27 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:09.567 06:59:27 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:09.567 06:59:27 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:09.567 06:59:27 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:09.567 06:59:27 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:09.567 06:59:27 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:09.567 06:59:27 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:09.567 06:59:27 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:09.567 06:59:27 -- setup/common.sh@18 -- # local node=0
00:04:09.567 06:59:27 -- setup/common.sh@19 -- # local var val
00:04:09.567 06:59:27 -- setup/common.sh@20 -- # local mem_f mem
00:04:09.567 06:59:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:09.567 06:59:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:09.567 06:59:27 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:09.567 06:59:27 -- setup/common.sh@28 -- # mapfile -t mem
00:04:09.567 06:59:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:09.567 06:59:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19409776 kB' 'MemUsed: 13175592 kB' 'SwapCached: 0 kB' 'Active: 7120224 kB' 'Inactive: 3526132 kB' 'Active(anon): 6900604 kB' 'Inactive(anon): 0 kB' 'Active(file): 219620 kB' 'Inactive(file): 3526132 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10154972 kB' 'Mapped: 125308 kB' 'AnonPages: 494644 kB' 'Shmem: 6409220 kB' 'KernelStack: 12056 kB' 'PageTables: 5744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 116492 kB' 'Slab: 430808 kB' 'SReclaimable: 116492 kB' 'SUnreclaim: 314316 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[setup/common.sh@31-32: the same per-field scan runs against HugePages_Surp over node0's meminfo, MemTotal through HugePages_Free]
00:04:09.568 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:09.568 06:59:27 -- setup/common.sh@33 -- # echo 0
00:04:09.568 06:59:27 -- setup/common.sh@33 -- # return 0
00:04:09.569 06:59:27 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:09.569 06:59:27 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:09.569 06:59:27 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:09.569 06:59:27 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:09.569 06:59:27 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:09.569 06:59:27 -- setup/common.sh@18 -- # local node=1
00:04:09.569 06:59:27 -- setup/common.sh@19 -- # local var val
00:04:09.569 06:59:27 -- setup/common.sh@20 -- # local mem_f mem
00:04:09.569 06:59:27 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:09.569 06:59:27 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:09.569 06:59:27 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:09.569 06:59:27 -- setup/common.sh@28 -- # mapfile -t mem
00:04:09.569 06:59:27 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:09.569 06:59:27 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698428 kB' 'MemFree: 22334476 kB' 'MemUsed: 5363952 kB' 'SwapCached: 0 kB' 'Active: 2196804 kB' 'Inactive: 162800 kB' 'Active(anon): 2000008 kB' 'Inactive(anon): 0 kB' 'Active(file): 196796 kB' 'Inactive(file): 162800 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2276784 kB' 'Mapped: 23408 kB' 'AnonPages: 82928 kB' 'Shmem: 1917188 kB' 'KernelStack: 9656 kB' 'PageTables: 1712 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111600 kB' 'Slab: 442780 kB' 'SReclaimable: 111600 kB' 'SUnreclaim: 331180 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
[setup/common.sh@31-32: the per-field scan repeats against HugePages_Surp over node1's meminfo, MemTotal through HugePages_Free]
00:04:09.570 06:59:27 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:09.570 06:59:27 -- setup/common.sh@33 -- # echo 0
00:04:09.570 06:59:27 -- setup/common.sh@33 -- # return 0
00:04:09.570 06:59:27 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:09.570 06:59:27 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:09.570 06:59:27 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:09.570 06:59:27 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:09.570 06:59:27 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:04:09.570 node0=512 expecting 513
00:04:09.570 06:59:27 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:09.570 06:59:27 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:09.570 06:59:27 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:09.570 06:59:27 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:04:09.570 node1=513 expecting 512
00:04:09.570 06:59:27 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:09.570
00:04:09.570 real 0m3.735s
00:04:09.570 user 0m1.428s
00:04:09.570 sys 0m2.384s
00:04:09.570 06:59:27 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:09.570 06:59:27 -- common/autotest_common.sh@10 -- # set +x
00:04:09.570 ************************************
00:04:09.570 END TEST odd_alloc
00:04:09.570 ************************************
00:04:09.570 06:59:27 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:04:09.570 06:59:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:09.570 06:59:27 -- common/autotest_common.sh@1093 -- # xtrace_disable
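The field-by-field scan that dominates the odd_alloc trace above is the whole of get_meminfo in setup/common.sh. As a reading aid, here is a minimal sketch reconstructed from the xtrace itself: the variable names, file paths, and the "Node +([0-9]) " prefix strip are taken straight from the trace, while the function signature and the trailing return 1 are assumptions, not the verbatim script.

    #!/usr/bin/env bash
    # Sketch of get_meminfo as reconstructed from the xtrace above.
    shopt -s extglob   # the "Node +([0-9]) " prefix strip is an extglob pattern

    get_meminfo() {
        local get=$1 node=$2     # e.g. get_meminfo HugePages_Surp 0
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # Per-node queries read that node's own meminfo file when it exists.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Node files prefix every line with "Node N "; strip that prefix.
        mem=("${mem[@]#Node +([0-9]) }")
        # Scan field by field until the requested one turns up, then print its value.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1   # assumed fallback; the trace only shows the success path
    }

Each [[ $var == ... ]] / continue pair in the log is one iteration of that loop. The odd_alloc verdict then hinges on hugepages.sh@127-130: sorted_t[nodes_test[node]]=1 and sorted_s[nodes_sys[node]]=1 use the per-node counts as array indices, so the final [[ 512 513 == \5\1\2\ \5\1\3 ]] compares the two sets of counts regardless of which node holds 512 and which holds 513, which is why "node0=512 expecting 513" still passes.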
common/autotest_common.sh@10 -- # set +x 00:04:09.570 ************************************ 00:04:09.570 START TEST custom_alloc 00:04:09.570 ************************************ 00:04:09.570 06:59:27 -- common/autotest_common.sh@1114 -- # custom_alloc 00:04:09.570 06:59:27 -- setup/hugepages.sh@167 -- # local IFS=, 00:04:09.570 06:59:27 -- setup/hugepages.sh@169 -- # local node 00:04:09.570 06:59:27 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:09.570 06:59:27 -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:09.570 06:59:27 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:09.570 06:59:27 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:09.570 06:59:27 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:09.570 06:59:27 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:09.570 06:59:27 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:09.570 06:59:27 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:09.570 06:59:27 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:09.570 06:59:27 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:09.570 06:59:27 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:09.570 06:59:27 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:09.570 06:59:27 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:09.570 06:59:27 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:09.570 06:59:27 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:09.570 06:59:27 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:09.570 06:59:27 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:09.570 06:59:27 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.570 06:59:27 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:09.570 06:59:27 -- setup/hugepages.sh@83 -- # : 256 00:04:09.570 06:59:27 -- setup/hugepages.sh@84 -- # : 1 00:04:09.570 06:59:27 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.570 06:59:27 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:09.570 06:59:27 -- setup/hugepages.sh@83 -- # : 0 00:04:09.571 06:59:27 -- setup/hugepages.sh@84 -- # : 0 00:04:09.571 06:59:27 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:09.571 06:59:27 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:09.571 06:59:27 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:09.571 06:59:27 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:09.571 06:59:27 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:09.571 06:59:27 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:09.830 06:59:27 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:09.830 06:59:27 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:09.830 06:59:27 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:09.830 06:59:27 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:09.830 06:59:27 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:09.830 06:59:27 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:09.830 06:59:27 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:09.830 06:59:27 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:09.830 06:59:27 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:09.830 06:59:27 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:09.830 06:59:27 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:09.830 06:59:27 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:09.830 06:59:27 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:09.830 06:59:27 -- 
setup/hugepages.sh@78 -- # return 0 00:04:09.830 06:59:27 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:09.830 06:59:27 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:09.830 06:59:27 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:09.830 06:59:27 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:09.830 06:59:27 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:09.830 06:59:27 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:09.830 06:59:27 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:09.830 06:59:27 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:09.830 06:59:27 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:09.830 06:59:27 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:09.830 06:59:27 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:09.830 06:59:27 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:09.830 06:59:27 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:09.830 06:59:27 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:09.830 06:59:27 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:09.830 06:59:27 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:09.830 06:59:27 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:09.830 06:59:27 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:09.830 06:59:27 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:09.830 06:59:27 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:09.830 06:59:27 -- setup/hugepages.sh@78 -- # return 0 00:04:09.830 06:59:27 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:09.830 06:59:27 -- setup/hugepages.sh@187 -- # setup output 00:04:09.830 06:59:27 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:09.830 06:59:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:13.121 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:13.121 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:13.121 06:59:31 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:13.121 06:59:31 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:13.121 06:59:31 -- setup/hugepages.sh@89 -- # local node 00:04:13.121 06:59:31 -- setup/hugepages.sh@90 -- # local sorted_t 
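Before following verify_nr_hugepages further: custom_alloc above sized its pools as 1 GiB on node 0 and 2 GiB on node 1, then handed the result to scripts/setup.sh through HUGENODE. A minimal sketch of that arithmetic and string building, assuming the 2048 kB hugepage size the log itself reports; kb_to_pages and parts are illustrative names, not the script's own:

    #!/usr/bin/env bash
    # Assumes the 2048 kB default hugepage size shown in the log ("Hugepagesize: 2048 kB").
    default_hugepages=2048   # kB per hugepage

    kb_to_pages() { echo $(( $1 / default_hugepages )); }

    declare -a nodes_hp
    nodes_hp[0]=$(kb_to_pages 1048576)   # get_test_nr_hugepages 1048576 -> 512 pages
    nodes_hp[1]=$(kb_to_pages 2097152)   # get_test_nr_hugepages 2097152 -> 1024 pages

    # Build the per-node override consumed by scripts/setup.sh
    # (hugepages.sh@181-187 in the trace).
    parts=()
    for node in "${!nodes_hp[@]}"; do
        parts+=("nodes_hp[$node]=${nodes_hp[node]}")
    done
    HUGENODE=$(IFS=,; echo "${parts[*]}")
    echo "$HUGENODE"   # -> nodes_hp[0]=512,nodes_hp[1]=1024

That accounts for the nr_hugepages=1536 recorded at hugepages.sh@188: 512 + 1024 pages across the two nodes, which is exactly the HugePages_Total the /proc/meminfo dump below reports.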
00:04:13.121 06:59:31 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:13.121 06:59:31 -- setup/hugepages.sh@92 -- # local surp
00:04:13.121 06:59:31 -- setup/hugepages.sh@93 -- # local resv
00:04:13.121 06:59:31 -- setup/hugepages.sh@94 -- # local anon
00:04:13.121 06:59:31 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:13.121 06:59:31 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:13.121 06:59:31 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:13.121 06:59:31 -- setup/common.sh@18 -- # local node=
00:04:13.121 06:59:31 -- setup/common.sh@19 -- # local var val
00:04:13.121 06:59:31 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.121 06:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.121 06:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.121 06:59:31 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.121 06:59:31 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.121 06:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.121 06:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40694640 kB' 'MemAvailable: 44417216 kB' 'Buffers: 9316 kB' 'Cached: 12422524 kB' 'SwapCached: 0 kB' 'Active: 9319036 kB' 'Inactive: 3688932 kB' 'Active(anon): 8902620 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579416 kB' 'Mapped: 148744 kB' 'Shmem: 8326492 kB' 'KReclaimable: 228188 kB' 'Slab: 874160 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 645972 kB' 'KernelStack: 21760 kB' 'PageTables: 7624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10166964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
[setup/common.sh@31-32: the per-field scan runs against AnonHugePages, MemTotal through HardwareCorrupted]
00:04:13.386 06:59:31 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:13.386 06:59:31 -- setup/common.sh@33 -- # echo 0
00:04:13.386 06:59:31 -- setup/common.sh@33 -- # return 0
00:04:13.386 06:59:31 -- setup/hugepages.sh@97 -- # anon=0
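Before trusting AnonHugePages, verify_nr_hugepages first checks that transparent hugepages are not fully disabled; the trace shows the string it read ("always [madvise] never"). A sketch of that gate, reusing the get_meminfo sketch above; the sysfs path is the stock kernel location and is assumed here, since the trace shows only the file's contents:

    # Sketch of the gate at hugepages.sh@96-99. The THP policy read in the
    # trace was "always [madvise] never", i.e. not "[never]", so anon is queried.
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
    fi
    surp=$(get_meminfo HugePages_Surp)      # queried next in the trace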
00:04:13.386 06:59:31 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:13.386 06:59:31 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.386 06:59:31 -- setup/common.sh@18 -- # local node=
00:04:13.386 06:59:31 -- setup/common.sh@19 -- # local var val
00:04:13.386 06:59:31 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.386 06:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.386 06:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.386 06:59:31 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.386 06:59:31 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.386 06:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.386 06:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40695168 kB' 'MemAvailable: 44417744 kB' 'Buffers: 9316 kB' 'Cached: 12422524 kB' 'SwapCached: 0 kB' 'Active: 9318052 kB' 'Inactive: 3688932 kB' 'Active(anon): 8901636 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 578380 kB' 'Mapped: 148724 kB' 'Shmem: 8326492 kB' 'KReclaimable: 228188 kB' 'Slab: 874216 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646028 kB' 'KernelStack: 21712 kB' 'PageTables: 7460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10166976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
[setup/common.sh@31-32: the per-field scan runs against HugePages_Surp, MemTotal through FileHugePages]
' 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # continue 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # continue 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # continue 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # continue 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # continue 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # continue 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # continue 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.387 06:59:31 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.387 06:59:31 -- setup/common.sh@33 -- # echo 0 00:04:13.387 06:59:31 -- setup/common.sh@33 -- # return 0 00:04:13.387 06:59:31 -- setup/hugepages.sh@99 -- # surp=0 00:04:13.387 06:59:31 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:13.387 06:59:31 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:13.387 06:59:31 -- setup/common.sh@18 -- # local node= 00:04:13.387 06:59:31 -- setup/common.sh@19 -- # local var val 00:04:13.387 06:59:31 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.387 06:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.387 06:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.387 06:59:31 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.387 06:59:31 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.387 06:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.387 06:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40695168 kB' 'MemAvailable: 44417744 kB' 'Buffers: 9316 kB' 'Cached: 12422524 kB' 'SwapCached: 0 kB' 'Active: 9317748 kB' 'Inactive: 3688932 kB' 'Active(anon): 8901332 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 
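The get_meminfo trace above is a plain field lookup over /proc/meminfo (or over /sys/devices/system/node/nodeN/meminfo when a node argument is given). A minimal standalone sketch of the same pattern, under an assumed name get_meminfo_field; this is not the SPDK helper verbatim:

get_meminfo_field() {    # hypothetical name; a sketch of the pattern, not SPDK's code
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # per-node meminfo lives in sysfs and prefixes every line with "Node N "
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    shopt -s extglob
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")    # drop the "Node N " prefix if present
    local line var val _
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
    done
    echo 0
}
# e.g. get_meminfo_field HugePages_Surp      -> 0 on this runner
#      get_meminfo_field HugePages_Free 0    -> 512 (node 0)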
00:04:13.387 06:59:31 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:13.387 06:59:31 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:13.387 06:59:31 -- setup/common.sh@18 -- # local node=
00:04:13.387 06:59:31 -- setup/common.sh@19 -- # local var val
00:04:13.387 06:59:31 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.387 06:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.387 06:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.387 06:59:31 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.387 06:59:31 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.387 06:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.387 06:59:31 -- setup/common.sh@31 -- # IFS=': '
00:04:13.387 06:59:31 -- setup/common.sh@31 -- # read -r var val _
00:04:13.387 06:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40695168 kB' 'MemAvailable: 44417744 kB' 'Buffers: 9316 kB' 'Cached: 12422524 kB' 'SwapCached: 0 kB' 'Active: 9317748 kB' 'Inactive: 3688932 kB' 'Active(anon): 8901332 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 578064 kB' 'Mapped: 148724 kB' 'Shmem: 8326492 kB' 'KReclaimable: 228188 kB' 'Slab: 874220 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646032 kB' 'KernelStack: 21712 kB' 'PageTables: 7460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10166992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:04:13.388 06:59:31 -- setup/common.sh@32 -- # [[ $var == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] / continue [repeated once per field of the printf above until HugePages_Rsvd matches]
00:04:13.389 06:59:31 -- setup/common.sh@33 -- # echo 0
00:04:13.389 06:59:31 -- setup/common.sh@33 -- # return 0
00:04:13.389 06:59:31 -- setup/hugepages.sh@100 -- # resv=0
00:04:13.389 06:59:31 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:04:13.389 nr_hugepages=1536
00:04:13.389 06:59:31 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:13.389 resv_hugepages=0
00:04:13.389 06:59:31 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:13.389 surplus_hugepages=0
00:04:13.389 06:59:31 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:13.389 anon_hugepages=0
00:04:13.389 06:59:31 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:13.389 06:59:31 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
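The hugepages.sh assertions just above encode a simple accounting identity: the kernel's HugePages_Total should equal the nr_hugepages the test requested plus any surplus and reserved pages. A minimal sketch of that check, reusing the hypothetical get_meminfo_field sketched earlier:

nr_hugepages=1536                                # page count the test configured
surp=$(get_meminfo_field HugePages_Surp)         # 0 in this run
resv=$(get_meminfo_field HugePages_Rsvd)         # 0 in this run
total=$(get_meminfo_field HugePages_Total)       # 1536 in this run
# fail loudly if the kernel's view disagrees with the requested allocation
(( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch: total=$total" >&2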
00:04:13.389 06:59:31 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:13.389 06:59:31 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:13.389 06:59:31 -- setup/common.sh@18 -- # local node=
00:04:13.389 06:59:31 -- setup/common.sh@19 -- # local var val
00:04:13.389 06:59:31 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.389 06:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.389 06:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.389 06:59:31 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.389 06:59:31 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.389 06:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.389 06:59:31 -- setup/common.sh@31 -- # IFS=': '
00:04:13.389 06:59:31 -- setup/common.sh@31 -- # read -r var val _
00:04:13.389 06:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40695096 kB' 'MemAvailable: 44417672 kB' 'Buffers: 9316 kB' 'Cached: 12422564 kB' 'SwapCached: 0 kB' 'Active: 9317724 kB' 'Inactive: 3688932 kB' 'Active(anon): 8901308 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 577996 kB' 'Mapped: 148724 kB' 'Shmem: 8326532 kB' 'KReclaimable: 228188 kB' 'Slab: 874220 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646032 kB' 'KernelStack: 21696 kB' 'PageTables: 7404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10167004 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:04:13.389 06:59:31 -- setup/common.sh@32 -- # [[ $var == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue [repeated once per field of the printf above until HugePages_Total matches]
00:04:13.390 06:59:31 -- setup/common.sh@33 -- # echo 1536
00:04:13.390 06:59:31 -- setup/common.sh@33 -- # return 0
00:04:13.390 06:59:31 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
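Worth noting: the long [[ ... ]] / continue runs in this trace are only bash xtrace output from the field-by-field scan loop; the same lookup can be done in one line of awk (an equivalent alternative, not what the harness actually runs):

awk '$1 == "HugePages_Total:" { print $2 }' /proc/meminfo    # prints 1536 on this runner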
00:04:13.390 06:59:31 -- setup/hugepages.sh@112 -- # get_nodes
00:04:13.390 06:59:31 -- setup/hugepages.sh@27 -- # local node
00:04:13.390 06:59:31 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:13.390 06:59:31 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:13.390 06:59:31 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:13.390 06:59:31 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:13.390 06:59:31 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:13.390 06:59:31 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:13.390 06:59:31 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:13.390 06:59:31 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:13.390 06:59:31 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:13.390 06:59:31 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.390 06:59:31 -- setup/common.sh@18 -- # local node=0
00:04:13.390 06:59:31 -- setup/common.sh@19 -- # local var val
00:04:13.390 06:59:31 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.390 06:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.390 06:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:13.390 06:59:31 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:13.390 06:59:31 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.390 06:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.390 06:59:31 -- setup/common.sh@31 -- # IFS=': '
00:04:13.390 06:59:31 -- setup/common.sh@31 -- # read -r var val _
00:04:13.390 06:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19424148 kB' 'MemUsed: 13161220 kB' 'SwapCached: 0 kB' 'Active: 7120628 kB' 'Inactive: 3526132 kB' 'Active(anon): 6901008 kB' 'Inactive(anon): 0 kB' 'Active(file): 219620 kB' 'Inactive(file): 3526132 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10155092 kB' 'Mapped: 125316 kB' 'AnonPages: 494820 kB' 'Shmem: 6409340 kB' 'KernelStack: 12024 kB' 'PageTables: 5644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 116588 kB' 'Slab: 431000 kB' 'SReclaimable: 116588 kB' 'SUnreclaim: 314412 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:13.390 06:59:31 -- setup/common.sh@32 -- # [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue [repeated once per node0 field above until HugePages_Surp matches]
00:04:13.391 06:59:31 -- setup/common.sh@33 -- # echo 0
00:04:13.391 06:59:31 -- setup/common.sh@33 -- # return 0
00:04:13.391 06:59:31 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
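The pass above then repeats the HugePages_Surp lookup once per NUMA node against that node's own meminfo file. A sketch of the walk, assuming 2048 kB pages and the hypothetical get_meminfo_field from earlier (variable names are illustrative; the sysfs paths are standard Linux):

shopt -s extglob
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    n=${node##*node}
    # pages currently configured on this node (512 on node0, 1024 on node1 here)
    nodes_sys[$n]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    surp=$(get_meminfo_field HugePages_Surp "$n")
    echo "node$n: nr_hugepages=${nodes_sys[$n]} surplus=$surp"
done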
00:04:13.391 06:59:31 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:13.391 06:59:31 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:13.391 06:59:31 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:13.391 06:59:31 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.391 06:59:31 -- setup/common.sh@18 -- # local node=1
00:04:13.391 06:59:31 -- setup/common.sh@19 -- # local var val
00:04:13.391 06:59:31 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.391 06:59:31 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.391 06:59:31 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:13.391 06:59:31 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:13.391 06:59:31 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.391 06:59:31 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.391 06:59:31 -- setup/common.sh@31 -- # IFS=': '
00:04:13.391 06:59:31 -- setup/common.sh@31 -- # read -r var val _
00:04:13.391 06:59:31 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698428 kB' 'MemFree: 21272736 kB' 'MemUsed: 6425692 kB' 'SwapCached: 0 kB' 'Active: 2197312 kB' 'Inactive: 162800 kB' 'Active(anon): 2000516 kB' 'Inactive(anon): 0 kB' 'Active(file): 196796 kB' 'Inactive(file): 162800 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2276792 kB' 'Mapped: 23408 kB' 'AnonPages: 83432 kB' 'Shmem: 1917196 kB' 'KernelStack: 9656 kB' 'PageTables: 1724 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 111600 kB' 'Slab: 443220 kB' 'SReclaimable: 111600 kB' 'SUnreclaim: 331620 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:13.392 06:59:31 -- setup/common.sh@32 -- # [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue [repeated once per node1 field above; the captured log ends mid-scan at the HugePages_Free comparison]
06:59:31 -- setup/common.sh@32 -- # continue 00:04:13.392 06:59:31 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.392 06:59:31 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.392 06:59:31 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.392 06:59:31 -- setup/common.sh@33 -- # echo 0 00:04:13.392 06:59:31 -- setup/common.sh@33 -- # return 0 00:04:13.392 06:59:31 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:13.392 06:59:31 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.392 06:59:31 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.392 06:59:31 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.392 06:59:31 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:13.392 node0=512 expecting 512 00:04:13.392 06:59:31 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.392 06:59:31 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.392 06:59:31 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.392 06:59:31 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:13.392 node1=1024 expecting 1024 00:04:13.392 06:59:31 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:13.392 00:04:13.392 real 0m3.752s 00:04:13.392 user 0m1.410s 00:04:13.392 sys 0m2.411s 00:04:13.392 06:59:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:13.392 06:59:31 -- common/autotest_common.sh@10 -- # set +x 00:04:13.392 ************************************ 00:04:13.392 END TEST custom_alloc 00:04:13.392 ************************************ 00:04:13.392 06:59:31 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:13.392 06:59:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:13.392 06:59:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:13.392 06:59:31 -- common/autotest_common.sh@10 -- # set +x 00:04:13.392 ************************************ 00:04:13.393 START TEST no_shrink_alloc 00:04:13.393 ************************************ 00:04:13.393 06:59:31 -- common/autotest_common.sh@1114 -- # no_shrink_alloc 00:04:13.393 06:59:31 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:13.393 06:59:31 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:13.393 06:59:31 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:13.393 06:59:31 -- setup/hugepages.sh@51 -- # shift 00:04:13.393 06:59:31 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:13.393 06:59:31 -- setup/hugepages.sh@52 -- # local node_ids 00:04:13.393 06:59:31 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:13.393 06:59:31 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:13.393 06:59:31 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:13.393 06:59:31 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:13.393 06:59:31 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:13.393 06:59:31 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:13.393 06:59:31 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:13.393 06:59:31 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:13.393 06:59:31 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:13.393 06:59:31 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:13.393 06:59:31 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:13.393 06:59:31 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:13.393 06:59:31 -- setup/hugepages.sh@73 -- # 
return 0 00:04:13.393 06:59:31 -- setup/hugepages.sh@198 -- # setup output 00:04:13.393 06:59:31 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:13.393 06:59:31 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:17.592 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:17.592 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:17.592 06:59:35 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:04:17.592 06:59:35 -- setup/hugepages.sh@89 -- # local node 00:04:17.592 06:59:35 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:17.592 06:59:35 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:17.592 06:59:35 -- setup/hugepages.sh@92 -- # local surp 00:04:17.592 06:59:35 -- setup/hugepages.sh@93 -- # local resv 00:04:17.592 06:59:35 -- setup/hugepages.sh@94 -- # local anon 00:04:17.592 06:59:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:17.592 06:59:35 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:17.592 06:59:35 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:17.592 06:59:35 -- setup/common.sh@18 -- # local node= 00:04:17.592 06:59:35 -- setup/common.sh@19 -- # local var val 00:04:17.592 06:59:35 -- setup/common.sh@20 -- # local mem_f mem 00:04:17.592 06:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:17.592 06:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:17.592 06:59:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:17.592 06:59:35 -- setup/common.sh@28 -- # mapfile -t mem 00:04:17.592 06:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:17.592 06:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:17.592 06:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:17.592 06:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41735048 kB' 'MemAvailable: 45457624 kB' 'Buffers: 9316 kB' 'Cached: 12422664 kB' 'SwapCached: 0 kB' 'Active: 9319456 kB' 'Inactive: 3688932 kB' 'Active(anon): 8903040 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579608 kB' 'Mapped: 148840 kB' 'Shmem: 8326632 kB' 'KReclaimable: 228188 kB' 'Slab: 874660 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646472 kB' 
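The field scans trimmed in this log all follow the same get_meminfo pattern traced at setup/common.sh@17-@33. A minimal sketch of that idiom, reconstructed from the trace rather than quoted from setup/common.sh (the real helper may differ in detail):

#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern below, as in the @29 line

# get_meminfo KEY [NODE] -- print the value of KEY from /proc/meminfo, or
# from a node's own meminfo file when NODE is given (mirrors the trace).
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo mem var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every row with "Node N "; strip it (the @29 line).
    mem=("${mem[@]#Node +([0-9]) }")
    # Scan field by field, continuing until the key matches (the @31-@33 lines).
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo HugePages_Total     # 1024 on this machine, per the snapshots
get_meminfo HugePages_Free 1    # per-node form, e.g. node1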
00:04:17.592 [xtrace trimmed: setup/common.sh@31-@32 reads the snapshot above field by field, continuing until the requested key (AnonHugePages) matches]
00:04:17.593 06:59:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:17.593 06:59:35 -- setup/common.sh@33 -- # echo 0
00:04:17.593 06:59:35 -- setup/common.sh@33 -- # return 0
00:04:17.593 06:59:35 -- setup/hugepages.sh@97 -- # anon=0
00:04:17.593 06:59:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:17.593 06:59:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:17.593 06:59:35 -- setup/common.sh@18 -- # local node=
00:04:17.593 06:59:35 -- setup/common.sh@19 -- # local var val
00:04:17.593 06:59:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.593 06:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.593 06:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.593 06:59:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.593 06:59:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.593 06:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.593 06:59:35 -- setup/common.sh@31 -- # IFS=': '
00:04:17.593 06:59:35 -- setup/common.sh@31 -- # read -r var val _
00:04:17.593 06:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41744280 kB' 'MemAvailable: 45466856 kB' 'Buffers: 9316 kB' 'Cached: 12422668 kB' 'SwapCached: 0 kB' 'Active: 9319704 kB' 'Inactive: 3688932 kB' 'Active(anon): 8903288 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580088 kB' 'Mapped: 148728 kB' 'Shmem: 8326636 kB' 'KReclaimable: 228188 kB' 'Slab: 874644 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646456 kB' 'KernelStack: 21712 kB' 'PageTables: 7404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10170428 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
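In script form, the bookkeeping this stretch of trace performs (setup/hugepages.sh@96-@99) amounts to roughly the following; a hedged sketch that reuses the get_meminfo sketch above, with the THP gate reconstructed from the @96 test:

# AnonHugePages only counts against the pool when transparent hugepages are
# enabled; the @96 test checks that the sysfs knob is not set to "[never]".
anon=0
if [[ $(< /sys/kernel/mm/transparent_hugepage/enabled) != *'[never]'* ]]; then
    anon=$(get_meminfo AnonHugePages)   # 0 kB in the snapshot above
fi
surp=$(get_meminfo HugePages_Surp)      # surplus pages beyond the fixed pool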
00:04:17.593 [xtrace trimmed: setup/common.sh@31-@32 reads the snapshot above field by field, continuing until the requested key (HugePages_Surp) matches]
00:04:17.594 06:59:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.594 06:59:35 -- setup/common.sh@33 -- # echo 0
00:04:17.594 06:59:35 -- setup/common.sh@33 -- # return 0
00:04:17.594 06:59:35 -- setup/hugepages.sh@99 -- # surp=0
00:04:17.594 06:59:35 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:17.594 06:59:35 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:17.594 06:59:35 -- setup/common.sh@18 -- # local node=
00:04:17.594 06:59:35 -- setup/common.sh@19 -- # local var val
00:04:17.594 06:59:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.594 06:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.594 06:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.594 06:59:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.594 06:59:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.594 06:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.594 06:59:35 -- setup/common.sh@31 -- # IFS=': '
00:04:17.594 06:59:35 -- setup/common.sh@31 -- # read -r var val _
00:04:17.595 06:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41746916 kB' 'MemAvailable: 45469492 kB' 'Buffers: 9316 kB' 'Cached: 12422684 kB' 'SwapCached: 0 kB' 'Active: 9319288 kB' 'Inactive: 3688932 kB' 'Active(anon): 8902872 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579524 kB' 'Mapped: 148728 kB' 'Shmem: 8326652 kB' 'KReclaimable: 228188 kB' 'Slab: 874644 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646456 kB' 'KernelStack: 21920 kB' 'PageTables: 7892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10171964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:04:17.595 [xtrace trimmed: setup/common.sh@31-@32 reads the snapshot above field by field, continuing until the requested key (HugePages_Rsvd) matches]
00:04:17.596 06:59:35 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:17.596 06:59:35 -- setup/common.sh@33 -- # echo 0
00:04:17.596 06:59:35 -- setup/common.sh@33 -- # return 0
00:04:17.596 06:59:35 -- setup/hugepages.sh@100 -- # resv=0
00:04:17.596 06:59:35 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:17.596 nr_hugepages=1024
00:04:17.596 06:59:35 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:17.596 resv_hugepages=0
00:04:17.596 06:59:35 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:17.596 surplus_hugepages=0
00:04:17.596 06:59:35 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:17.596 anon_hugepages=0
00:04:17.596 06:59:35 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:17.596 06:59:35 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:17.596 06:59:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
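The @107-@110 checks just traced assert that the hugepage pool is internally consistent and, for no_shrink_alloc, that it kept its size. Roughly, and assuming the literal 1024s in the trace are the already-expanded pool total and requested count (an assumption, since the expansion happens before xtrace prints):

resv=$(get_meminfo HugePages_Rsvd)     # 0 here: reserved but not yet faulted in
total=$(get_meminfo HugePages_Total)   # 1024 here
nr_hugepages=1024                      # requested by get_test_nr_hugepages above
(( total == nr_hugepages + surp + resv ))   # pool accounting must balance (@107)
(( total == nr_hugepages ))                 # the pool did not shrink or grow (@109)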
00:04:17.596 06:59:35 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:17.596 06:59:35 -- setup/common.sh@18 -- # local node=
00:04:17.596 06:59:35 -- setup/common.sh@19 -- # local var val
00:04:17.596 06:59:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.596 06:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.596 06:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:17.596 06:59:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:17.596 06:59:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.596 06:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:17.596 06:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41745332 kB' 'MemAvailable: 45467908 kB' 'Buffers: 9316 kB' 'Cached: 12422684 kB' 'SwapCached: 0 kB' 'Active: 9319076 kB' 'Inactive: 3688932 kB' 'Active(anon): 8902660 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 579304 kB' 'Mapped: 148712 kB' 'Shmem: 8326652 kB' 'KReclaimable: 228188 kB' 'Slab: 874644 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646456 kB' 'KernelStack: 21808 kB' 'PageTables: 7732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10171976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:04:17.596 06:59:35 [xtrace elided: per-key scan of the snapshot until HugePages_Total matches]
00:04:17.597 06:59:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:17.597 06:59:35 -- setup/common.sh@33 -- # echo 1024
00:04:17.597 06:59:35 -- setup/common.sh@33 -- # return 0
00:04:17.597 06:59:35 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:17.597 06:59:35 -- setup/hugepages.sh@112 -- # get_nodes
00:04:17.597 06:59:35 -- setup/hugepages.sh@27 -- # local node
00:04:17.597 06:59:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:17.597 06:59:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:17.597 06:59:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:17.597 06:59:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:17.597 06:59:35 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:17.597 06:59:35 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:17.597 06:59:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:17.597 06:59:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:17.597 06:59:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:17.597 06:59:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:17.597 06:59:35 -- setup/common.sh@18 -- # local node=0
00:04:17.597 06:59:35 -- setup/common.sh@19 -- # local var val
00:04:17.597 06:59:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:17.597 06:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:17.597 06:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:17.597 06:59:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:17.597 06:59:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:17.597 06:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
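The get_nodes call above records, per NUMA node, how many hugepages the kernel has allocated (here node0=1024, node1=0, no_nodes=2). A hedged sketch of that bookkeeping; the hugepages-2048kB sysfs path is an assumption based on the standard kernel layout for 2048 kB pages, not quoted from hugepages.sh:

    # Sketch: collect per-node 2 MB hugepage counts, as get_nodes does.
    shopt -s extglob nullglob
    nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        # Assumed sysfs path for the 2048 kB page size on each node.
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    echo "no_nodes=${#nodes_sys[@]}"    # 2 on this machine: node0=1024, node1=0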
00:04:17.597 06:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18373656 kB' 'MemUsed: 14211712 kB' 'SwapCached: 0 kB' 'Active: 7124552 kB' 'Inactive: 3526132 kB' 'Active(anon): 6904932 kB' 'Inactive(anon): 0 kB' 'Active(file): 219620 kB' 'Inactive(file): 3526132 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10155228 kB' 'Mapped: 125320 kB' 'AnonPages: 498728 kB' 'Shmem: 6409476 kB' 'KernelStack: 12024 kB' 'PageTables: 5648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 116588 kB' 'Slab: 431080 kB' 'SReclaimable: 116588 kB' 'SUnreclaim: 314492 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:04:17.597 06:59:35 [xtrace elided: per-key scan of the node0 snapshot until HugePages_Surp matches]
00:04:17.598 06:59:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:17.598 06:59:35 -- setup/common.sh@33 -- # echo 0
00:04:17.598 06:59:35 -- setup/common.sh@33 -- # return 0
00:04:17.598 06:59:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:17.598 06:59:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:17.598 06:59:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:17.598 06:59:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:17.598 06:59:35 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:17.598 node0=1024 expecting 1024
00:04:17.598 06:59:35 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:17.598 06:59:35 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:17.598 06:59:35 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:17.598 06:59:35 -- setup/hugepages.sh@202 -- # setup output
00:04:17.598 06:59:35 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:17.598 06:59:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:20.894 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:20.894 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:20.894 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:20.894 06:59:38 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:20.894 06:59:38 -- setup/hugepages.sh@89 -- # local node
00:04:20.894 06:59:38 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:20.894 06:59:38 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:20.894 06:59:38 -- setup/hugepages.sh@92 -- # local surp
00:04:20.894 06:59:38 -- setup/hugepages.sh@93 -- # local resv
00:04:20.894 06:59:38 -- setup/hugepages.sh@94 -- # local anon
00:04:20.894 06:59:38 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:20.894 06:59:38 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
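The `setup output` step re-runs scripts/setup.sh with CLEAR_HUGE=no and NRHUGE=512; because 1024 pages are already allocated on node0 and CLEAR_HUGE=no keeps them, the request for 512 is satisfied without any change, which is exactly what the INFO line reports. Roughly equivalent as a standalone invocation (illustrative, using this workspace's paths):

    # Re-run the SPDK setup script, keeping any hugepages already allocated.
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    sudo CLEAR_HUGE=no NRHUGE=512 ./scripts/setup.sh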
00:04:20.894 06:59:38 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:20.894 06:59:38 -- setup/common.sh@18 -- # local node=
00:04:20.894 06:59:38 -- setup/common.sh@19 -- # local var val
00:04:20.894 06:59:38 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.894 06:59:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.894 06:59:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.894 06:59:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.894 06:59:38 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.894 06:59:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.894 06:59:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41729040 kB' 'MemAvailable: 45451616 kB' 'Buffers: 9316 kB' 'Cached: 12422792 kB' 'SwapCached: 0 kB' 'Active: 9320560 kB' 'Inactive: 3688932 kB' 'Active(anon): 8904144 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580148 kB' 'Mapped: 148764 kB' 'Shmem: 8326760 kB' 'KReclaimable: 228188 kB' 'Slab: 874512 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646324 kB' 'KernelStack: 21744 kB' 'PageTables: 7560 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10168536 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:04:20.894 06:59:38 [xtrace elided: per-key scan of the snapshot until AnonHugePages matches]
00:04:20.895 06:59:38 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:20.895 06:59:38 -- setup/common.sh@33 -- # echo 0
00:04:20.895 06:59:38 -- setup/common.sh@33 -- # return 0
00:04:20.895 06:59:38 -- setup/hugepages.sh@97 -- # anon=0
00:04:20.895 06:59:38 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:20.895 06:59:38 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:20.895 06:59:38 -- setup/common.sh@18 -- # local node=
00:04:20.895 06:59:38 -- setup/common.sh@19 -- # local var val
00:04:20.895 06:59:38 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.895 06:59:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.895 06:59:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.895 06:59:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.895 06:59:38 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.895 06:59:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
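The verify_nr_hugepages pass only counts AnonHugePages when transparent hugepages are not globally disabled; the @96 test above matches the bracketed policy in the THP 'enabled' file ("always [madvise] never" on this host) against [never]. A small sketch of that gate, reusing the get_meminfo sketch from earlier:

    # THP gate, as at setup/hugepages.sh@96: skip anon accounting when
    # transparent hugepages are globally off.
    thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB in this run
    else
        anon=0
    fi
    echo "anon_hugepages=$anon"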
00:04:20.895 06:59:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41729836 kB' 'MemAvailable: 45452412 kB' 'Buffers: 9316 kB' 'Cached: 12422796 kB' 'SwapCached: 0 kB' 'Active: 9320180 kB' 'Inactive: 3688932 kB' 'Active(anon): 8903764 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580332 kB' 'Mapped: 148812 kB' 'Shmem: 8326764 kB' 'KReclaimable: 228188 kB' 'Slab: 874656 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646468 kB' 'KernelStack: 21712 kB' 'PageTables: 7464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10168548 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB'
00:04:20.895 06:59:38 [xtrace elided: per-key scan of the snapshot until HugePages_Surp matches]
00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.897 06:59:38 -- setup/common.sh@33 -- # echo 0
00:04:20.897 06:59:38 -- setup/common.sh@33 -- # return 0
00:04:20.897 06:59:38 -- setup/hugepages.sh@99 -- # surp=0
00:04:20.897 06:59:38 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:20.897 06:59:38 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:20.897 06:59:38 -- setup/common.sh@18 -- # local node=
00:04:20.897 06:59:38 -- setup/common.sh@19 -- # local var val
00:04:20.897 06:59:38 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.897 06:59:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.897 06:59:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.897 06:59:38 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.897 06:59:38 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.897 06:59:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- 
setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.897 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.897 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 
06:59:38 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 
06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:20.898 06:59:38 -- setup/common.sh@33 -- # echo 0 00:04:20.898 06:59:38 -- setup/common.sh@33 -- # return 0 00:04:20.898 06:59:38 -- setup/hugepages.sh@100 -- # resv=0 00:04:20.898 06:59:38 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:20.898 nr_hugepages=1024 00:04:20.898 06:59:38 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:20.898 resv_hugepages=0 00:04:20.898 06:59:38 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:20.898 surplus_hugepages=0 00:04:20.898 06:59:38 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:20.898 anon_hugepages=0 00:04:20.898 06:59:38 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.898 06:59:38 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:20.898 06:59:38 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:20.898 06:59:38 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:20.898 06:59:38 -- setup/common.sh@18 -- # local node= 00:04:20.898 06:59:38 -- setup/common.sh@19 -- # local var val 00:04:20.898 06:59:38 -- setup/common.sh@20 -- # local mem_f mem 00:04:20.898 06:59:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.898 06:59:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:20.898 06:59:38 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:20.898 06:59:38 -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.898 06:59:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41733120 kB' 'MemAvailable: 45455696 kB' 'Buffers: 9316 kB' 'Cached: 12422808 kB' 'SwapCached: 0 kB' 'Active: 9320220 kB' 'Inactive: 3688932 kB' 'Active(anon): 8903804 kB' 'Inactive(anon): 0 kB' 'Active(file): 416416 kB' 'Inactive(file): 3688932 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580360 kB' 'Mapped: 148736 kB' 'Shmem: 8326776 kB' 'KReclaimable: 228188 kB' 'Slab: 874656 kB' 'SReclaimable: 228188 kB' 'SUnreclaim: 646468 kB' 'KernelStack: 21744 kB' 'PageTables: 7568 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10168576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 490868 kB' 'DirectMap2M: 10729472 kB' 'DirectMap1G: 58720256 kB' 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.898 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.898 06:59:38 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.898 
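For orientation, the scans condensed above are a plain field lookup over /proc/meminfo. A minimal Bash sketch of the pattern, reconstructed from the xtrace (the helper name get_meminfo and the per-node sysfs fallback are taken from the trace; this is not the verbatim SPDK setup/common.sh):

  get_meminfo() {
      local get=$1 node=${2-} var val _
      local mem_f=/proc/meminfo
      # With a node index, read the per-node stats from sysfs instead.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      # Scan field by field; every non-matching field is the
      # "continue" iteration seen in the trace above.
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] || continue
          echo "${val:-0}"
          return 0
      done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")   # strip "Node N " prefixes
      return 1
  }

On this box, 'get_meminfo HugePages_Total' prints 1024 and 'get_meminfo HugePages_Surp 0' reads node0's meminfo and prints 0, matching the values echoed in the trace.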
00:04:20.899 06:59:38 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:20.899 06:59:38 -- setup/common.sh@33 -- # echo 1024 00:04:20.899 06:59:38 -- setup/common.sh@33 -- # return 0 00:04:20.899 06:59:38 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:20.899 06:59:38 -- setup/hugepages.sh@112 -- # get_nodes 00:04:20.899 06:59:38 -- setup/hugepages.sh@27 -- # local node 00:04:20.899 06:59:38 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.899 06:59:38 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:20.899 06:59:38 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:20.900 06:59:38 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:20.900 06:59:38 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:20.900 06:59:38 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:20.900 06:59:38 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
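The node bookkeeping traced here reads as: enumerate the NUMA nodes under sysfs, record a hugepage count per node (nodes_sys[0]=1024, nodes_sys[1]=0 on this machine), and require at least one node. A rough sketch under those assumptions (the trace assigns the counts directly, so reading nr_hugepages from sysfs below is an illustrative stand-in, not necessarily how hugepages.sh obtains them):

  shopt -s extglob nullglob
  declare -a nodes_sys
  for node in /sys/devices/system/node/node+([0-9]); do
      # e.g. nodes_sys[0]=1024, nodes_sys[1]=0 as echoed in the trace
      nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
  done
  no_nodes=${#nodes_sys[@]}
  (( no_nodes > 0 )) || echo "no NUMA nodes found" >&2

The per-node check that follows then adds reserved and surplus pages into nodes_test[node] and compares each node's total against the expected allocation.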
"${!nodes_test[@]}" 00:04:20.900 06:59:38 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:20.900 06:59:38 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:20.900 06:59:38 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.900 06:59:38 -- setup/common.sh@18 -- # local node=0 00:04:20.900 06:59:38 -- setup/common.sh@19 -- # local var val 00:04:20.900 06:59:38 -- setup/common.sh@20 -- # local mem_f mem 00:04:20.900 06:59:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.900 06:59:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:20.900 06:59:38 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:20.900 06:59:38 -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.900 06:59:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 18349108 kB' 'MemUsed: 14236260 kB' 'SwapCached: 0 kB' 'Active: 7122992 kB' 'Inactive: 3526132 kB' 'Active(anon): 6903372 kB' 'Inactive(anon): 0 kB' 'Active(file): 219620 kB' 'Inactive(file): 3526132 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10155352 kB' 'Mapped: 125328 kB' 'AnonPages: 497088 kB' 'Shmem: 6409600 kB' 'KernelStack: 12088 kB' 'PageTables: 5828 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 116588 kB' 'Slab: 431256 kB' 'SReclaimable: 116588 kB' 'SUnreclaim: 314668 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- 
setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 
00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.900 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.900 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.901 06:59:38 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.901 06:59:38 -- setup/common.sh@32 -- # continue 00:04:20.901 06:59:38 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.901 06:59:38 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.901 06:59:38 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.901 06:59:38 -- setup/common.sh@33 -- # echo 0 00:04:20.901 06:59:38 -- setup/common.sh@33 -- # return 0 00:04:20.901 06:59:38 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:20.901 06:59:38 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:20.901 06:59:38 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:20.901 06:59:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:20.901 06:59:39 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:20.901 node0=1024 expecting 1024 00:04:20.901 06:59:39 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:20.901 00:04:20.901 real 0m7.417s 00:04:20.901 user 0m2.784s 00:04:20.901 sys 0m4.775s 00:04:20.901 06:59:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:20.901 06:59:39 -- common/autotest_common.sh@10 -- # set +x 00:04:20.901 ************************************ 00:04:20.901 END TEST no_shrink_alloc 00:04:20.901 ************************************ 00:04:20.901 06:59:39 -- setup/hugepages.sh@217 -- # clear_hp 00:04:20.901 06:59:39 -- setup/hugepages.sh@37 -- # local node hp 00:04:20.901 06:59:39 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:20.901 06:59:39 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:20.901 06:59:39 -- setup/hugepages.sh@41 -- # echo 0 00:04:20.901 06:59:39 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:20.901 06:59:39 -- setup/hugepages.sh@41 -- # echo 0 00:04:20.901 06:59:39 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:20.901 06:59:39 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:20.901 06:59:39 -- setup/hugepages.sh@41 -- # echo 0 00:04:20.901 06:59:39 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:20.901 06:59:39 -- setup/hugepages.sh@41 -- # echo 0 00:04:20.901 06:59:39 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:20.901 06:59:39 -- 
00:04:20.901 00:04:20.901 real 0m28.446s 00:04:20.901 user 0m10.252s 00:04:20.901 sys 0m17.251s 06:59:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:20.901 06:59:39 -- common/autotest_common.sh@10 -- # set +x 00:04:20.901 ************************************ 00:04:20.901 END TEST hugepages 00:04:20.901 ************************************ 00:04:20.901 06:59:39 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:20.901 06:59:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:20.901 06:59:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:20.901 06:59:39 -- common/autotest_common.sh@10 -- # set +x 00:04:20.901 ************************************ 00:04:20.901 START TEST driver 00:04:20.901 ************************************ 00:04:20.901 06:59:39 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:21.161 * Looking for test storage... 00:04:21.161 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:21.161 06:59:39 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:21.161 06:59:39 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:21.161 06:59:39 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:21.161 06:59:39 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:21.161 06:59:39 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:21.161 06:59:39 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:21.161 06:59:39 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:21.161 06:59:39 -- scripts/common.sh@335 -- # IFS=.-: 00:04:21.161 06:59:39 -- scripts/common.sh@335 -- # read -ra ver1 00:04:21.161 06:59:39 -- scripts/common.sh@336 -- # IFS=.-: 00:04:21.161 06:59:39 -- scripts/common.sh@336 -- # read -ra ver2 00:04:21.161 06:59:39 -- scripts/common.sh@337 -- # local 'op=<' 00:04:21.161 06:59:39 -- scripts/common.sh@339 -- # ver1_l=2 00:04:21.161 06:59:39 -- scripts/common.sh@340 -- # ver2_l=1 00:04:21.161 06:59:39 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:21.161 06:59:39 -- scripts/common.sh@343 -- # case "$op" in 00:04:21.161 06:59:39 -- scripts/common.sh@344 -- # : 1 00:04:21.161 06:59:39 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:21.161 06:59:39 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:21.161 06:59:39 -- scripts/common.sh@364 -- # decimal 1 00:04:21.161 06:59:39 -- scripts/common.sh@352 -- # local d=1 00:04:21.161 06:59:39 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:21.161 06:59:39 -- scripts/common.sh@354 -- # echo 1 00:04:21.161 06:59:39 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:21.161 06:59:39 -- scripts/common.sh@365 -- # decimal 2 00:04:21.161 06:59:39 -- scripts/common.sh@352 -- # local d=2 00:04:21.161 06:59:39 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:21.161 06:59:39 -- scripts/common.sh@354 -- # echo 2 00:04:21.161 06:59:39 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:21.161 06:59:39 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:21.161 06:59:39 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:21.161 06:59:39 -- scripts/common.sh@367 -- # return 0 00:04:21.161 06:59:39 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:21.161 06:59:39 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:21.161 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:21.161 --rc genhtml_branch_coverage=1 00:04:21.161 --rc genhtml_function_coverage=1 00:04:21.161 --rc genhtml_legend=1 00:04:21.161 --rc geninfo_all_blocks=1 00:04:21.161 --rc geninfo_unexecuted_blocks=1 00:04:21.161 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:21.161 ' 00:04:21.161 06:59:39 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:21.161 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:21.161 --rc genhtml_branch_coverage=1 00:04:21.161 --rc genhtml_function_coverage=1 00:04:21.161 --rc genhtml_legend=1 00:04:21.161 --rc geninfo_all_blocks=1 00:04:21.161 --rc geninfo_unexecuted_blocks=1 00:04:21.161 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:21.161 ' 00:04:21.161 06:59:39 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:21.161 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:21.161 --rc genhtml_branch_coverage=1 00:04:21.161 --rc genhtml_function_coverage=1 00:04:21.161 --rc genhtml_legend=1 00:04:21.161 --rc geninfo_all_blocks=1 00:04:21.161 --rc geninfo_unexecuted_blocks=1 00:04:21.161 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:21.161 ' 00:04:21.161 06:59:39 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:21.161 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:21.161 --rc genhtml_branch_coverage=1 00:04:21.161 --rc genhtml_function_coverage=1 00:04:21.161 --rc genhtml_legend=1 00:04:21.161 --rc geninfo_all_blocks=1 00:04:21.161 --rc geninfo_unexecuted_blocks=1 00:04:21.161 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:21.161 ' 00:04:21.161 06:59:39 -- setup/driver.sh@68 -- # setup reset 00:04:21.161 06:59:39 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:21.161 06:59:39 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:26.437 06:59:44 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:26.437 06:59:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:26.437 06:59:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:26.437 06:59:44 -- common/autotest_common.sh@10 -- # set +x 00:04:26.437 ************************************ 00:04:26.437 START TEST guess_driver 
00:04:26.437 ************************************ 06:59:44 -- common/autotest_common.sh@1114 -- # guess_driver 00:04:26.437 06:59:44 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:26.437 06:59:44 -- setup/driver.sh@47 -- # local fail=0 00:04:26.437 06:59:44 -- setup/driver.sh@49 -- # pick_driver 00:04:26.437 06:59:44 -- setup/driver.sh@36 -- # vfio 00:04:26.437 06:59:44 -- setup/driver.sh@21 -- # local iommu_groups 00:04:26.437 06:59:44 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:26.437 06:59:44 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:26.437 06:59:44 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:26.437 06:59:44 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:26.437 06:59:44 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:04:26.437 06:59:44 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:26.437 06:59:44 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:26.437 06:59:44 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:26.437 06:59:44 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:26.437 06:59:44 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 
insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 
insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 
insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 
insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 
insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 
insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 
insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:26.437 06:59:44 -- setup/driver.sh@30 -- # return 0 00:04:26.437 06:59:44 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:26.437 06:59:44 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:26.437 06:59:44 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:26.437 06:59:44 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:26.437 Looking for driver=vfio-pci 00:04:26.437 06:59:44 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:26.437 06:59:44 -- setup/driver.sh@45 -- # setup output config 00:04:26.437 06:59:44 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:26.437 06:59:44 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:29.728 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
[xtrace condensed: the setup/driver.sh@58/@61/@57 marker check repeats identically for each 'Looking for driver=vfio-pci' marker printed by setup config; every device resolved to vfio-pci]
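Condensing the driver pick traced above: with 176 IOMMU groups present and unsafe no-IOMMU mode disabled, pick_driver settles on vfio-pci once modprobe can resolve the module dependency chain. A sketch of that decision as reconstructed from the xtrace (the real driver.sh likely carries more fallbacks, e.g. uio_pci_generic, which this sketch omits):

  pick_driver() {
      local unsafe_vfio=N
      if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
          unsafe_vfio=$(< /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
      fi
      local iommu_groups=(/sys/kernel/iommu_groups/*)
      # vfio-pci is usable when IOMMU groups exist (176 on this host) or
      # unsafe no-IOMMU mode is enabled, and the module chain resolves.
      if (( ${#iommu_groups[@]} > 0 )) || [[ $unsafe_vfio == [Yy]* ]]; then
          if [[ $(modprobe --show-depends vfio_pci) == *.ko* ]]; then
              echo vfio-pci
              return 0
          fi
      fi
      echo 'No valid driver found'
  }

setup config then binds the controllers, and the marker loop around it confirms that each one reports vfio-pci, which is why the final check below passes with fail=0.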
setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.729 06:59:47 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.729 06:59:47 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.729 06:59:47 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:31.635 06:59:49 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:31.635 06:59:49 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:31.635 06:59:49 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:31.635 06:59:49 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:31.635 06:59:49 -- setup/driver.sh@65 -- # setup reset 00:04:31.635 06:59:49 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:31.635 06:59:49 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:36.910 00:04:36.910 real 0m10.324s 00:04:36.910 user 0m2.784s 00:04:36.910 sys 0m5.211s 00:04:36.910 06:59:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:36.910 06:59:54 -- common/autotest_common.sh@10 
-- # set +x 00:04:36.910 ************************************ 00:04:36.910 END TEST guess_driver 00:04:36.910 ************************************ 00:04:36.910 00:04:36.910 real 0m15.509s 00:04:36.910 user 0m4.270s 00:04:36.910 sys 0m8.094s 00:04:36.910 06:59:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:36.910 06:59:54 -- common/autotest_common.sh@10 -- # set +x 00:04:36.910 ************************************ 00:04:36.910 END TEST driver 00:04:36.910 ************************************ 00:04:36.910 06:59:54 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:36.910 06:59:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:36.910 06:59:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:36.910 06:59:54 -- common/autotest_common.sh@10 -- # set +x 00:04:36.910 ************************************ 00:04:36.910 START TEST devices 00:04:36.910 ************************************ 00:04:36.910 06:59:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:36.910 * Looking for test storage... 00:04:36.910 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:36.910 06:59:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:36.910 06:59:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:36.910 06:59:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:36.910 06:59:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:36.910 06:59:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:36.910 06:59:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:36.910 06:59:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:36.910 06:59:54 -- scripts/common.sh@335 -- # IFS=.-: 00:04:36.910 06:59:54 -- scripts/common.sh@335 -- # read -ra ver1 00:04:36.910 06:59:54 -- scripts/common.sh@336 -- # IFS=.-: 00:04:36.910 06:59:54 -- scripts/common.sh@336 -- # read -ra ver2 00:04:36.910 06:59:54 -- scripts/common.sh@337 -- # local 'op=<' 00:04:36.910 06:59:54 -- scripts/common.sh@339 -- # ver1_l=2 00:04:36.910 06:59:54 -- scripts/common.sh@340 -- # ver2_l=1 00:04:36.910 06:59:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:36.910 06:59:54 -- scripts/common.sh@343 -- # case "$op" in 00:04:36.910 06:59:54 -- scripts/common.sh@344 -- # : 1 00:04:36.910 06:59:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:36.910 06:59:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:36.910 06:59:54 -- scripts/common.sh@364 -- # decimal 1 00:04:36.910 06:59:54 -- scripts/common.sh@352 -- # local d=1 00:04:36.910 06:59:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:36.910 06:59:54 -- scripts/common.sh@354 -- # echo 1 00:04:36.910 06:59:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:36.910 06:59:54 -- scripts/common.sh@365 -- # decimal 2 00:04:36.910 06:59:54 -- scripts/common.sh@352 -- # local d=2 00:04:36.910 06:59:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:36.910 06:59:54 -- scripts/common.sh@354 -- # echo 2 00:04:36.910 06:59:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:36.910 06:59:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:36.910 06:59:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:36.910 06:59:54 -- scripts/common.sh@367 -- # return 0 00:04:36.910 06:59:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:36.910 06:59:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:36.910 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:36.910 --rc genhtml_branch_coverage=1 00:04:36.910 --rc genhtml_function_coverage=1 00:04:36.910 --rc genhtml_legend=1 00:04:36.910 --rc geninfo_all_blocks=1 00:04:36.910 --rc geninfo_unexecuted_blocks=1 00:04:36.910 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:36.910 ' 00:04:36.910 06:59:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:36.910 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:36.911 --rc genhtml_branch_coverage=1 00:04:36.911 --rc genhtml_function_coverage=1 00:04:36.911 --rc genhtml_legend=1 00:04:36.911 --rc geninfo_all_blocks=1 00:04:36.911 --rc geninfo_unexecuted_blocks=1 00:04:36.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:36.911 ' 00:04:36.911 06:59:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:36.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:36.911 --rc genhtml_branch_coverage=1 00:04:36.911 --rc genhtml_function_coverage=1 00:04:36.911 --rc genhtml_legend=1 00:04:36.911 --rc geninfo_all_blocks=1 00:04:36.911 --rc geninfo_unexecuted_blocks=1 00:04:36.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:36.911 ' 00:04:36.911 06:59:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:36.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:36.911 --rc genhtml_branch_coverage=1 00:04:36.911 --rc genhtml_function_coverage=1 00:04:36.911 --rc genhtml_legend=1 00:04:36.911 --rc geninfo_all_blocks=1 00:04:36.911 --rc geninfo_unexecuted_blocks=1 00:04:36.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:36.911 ' 00:04:36.911 06:59:54 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:36.911 06:59:54 -- setup/devices.sh@192 -- # setup reset 00:04:36.911 06:59:54 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:36.911 06:59:54 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:41.105 06:59:58 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:41.105 06:59:58 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:04:41.105 06:59:58 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:04:41.105 06:59:58 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:04:41.105 06:59:58 
-- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:04:41.105 06:59:58 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:04:41.105 06:59:58 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:04:41.105 06:59:58 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:41.105 06:59:58 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:04:41.105 06:59:58 -- setup/devices.sh@196 -- # blocks=() 00:04:41.105 06:59:58 -- setup/devices.sh@196 -- # declare -a blocks 00:04:41.105 06:59:58 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:41.105 06:59:58 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:41.105 06:59:58 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:41.105 06:59:58 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:41.105 06:59:58 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:41.105 06:59:58 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:41.105 06:59:58 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:41.105 06:59:58 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:41.105 06:59:58 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:41.105 06:59:58 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:41.105 06:59:58 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:41.105 No valid GPT data, bailing 00:04:41.105 06:59:58 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:41.105 06:59:58 -- scripts/common.sh@393 -- # pt= 00:04:41.105 06:59:58 -- scripts/common.sh@394 -- # return 1 00:04:41.105 06:59:58 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:41.105 06:59:58 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:41.105 06:59:58 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:41.105 06:59:58 -- setup/common.sh@80 -- # echo 1600321314816 00:04:41.105 06:59:58 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:41.105 06:59:58 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:41.105 06:59:58 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:41.105 06:59:58 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:41.105 06:59:58 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:41.105 06:59:58 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:41.105 06:59:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:41.105 06:59:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:41.105 06:59:58 -- common/autotest_common.sh@10 -- # set +x 00:04:41.105 ************************************ 00:04:41.105 START TEST nvme_mount 00:04:41.105 ************************************ 00:04:41.105 06:59:58 -- common/autotest_common.sh@1114 -- # nvme_mount 00:04:41.105 06:59:58 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:41.105 06:59:58 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:41.105 06:59:58 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:41.105 06:59:58 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:41.105 06:59:58 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:41.105 06:59:58 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:41.105 06:59:58 -- setup/common.sh@40 -- # local part_no=1 00:04:41.105 06:59:58 -- setup/common.sh@41 -- # 
local size=1073741824 00:04:41.105 06:59:58 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:41.105 06:59:58 -- setup/common.sh@44 -- # parts=() 00:04:41.105 06:59:58 -- setup/common.sh@44 -- # local parts 00:04:41.105 06:59:58 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:41.105 06:59:58 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:41.105 06:59:58 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:41.105 06:59:58 -- setup/common.sh@46 -- # (( part++ )) 00:04:41.105 06:59:58 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:41.105 06:59:58 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:41.105 06:59:58 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:41.105 06:59:58 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:41.674 Creating new GPT entries in memory. 00:04:41.674 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:41.674 other utilities. 00:04:41.674 06:59:59 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:41.674 06:59:59 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:41.674 06:59:59 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:41.674 06:59:59 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:41.674 06:59:59 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:43.052 Creating new GPT entries in memory. 00:04:43.052 The operation has completed successfully. 00:04:43.052 07:00:00 -- setup/common.sh@57 -- # (( part++ )) 00:04:43.052 07:00:00 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:43.052 07:00:00 -- setup/common.sh@62 -- # wait 448898 00:04:43.052 07:00:00 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:43.052 07:00:00 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:43.052 07:00:00 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:43.052 07:00:00 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:43.052 07:00:00 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:43.052 07:00:00 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:43.052 07:00:01 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:43.052 07:00:01 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:43.052 07:00:01 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:43.052 07:00:01 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:43.052 07:00:01 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:43.052 07:00:01 -- setup/devices.sh@53 -- # local found=0 00:04:43.052 07:00:01 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:43.052 07:00:01 -- setup/devices.sh@56 -- # : 00:04:43.052 07:00:01 -- setup/devices.sh@59 -- # local pci status 00:04:43.052 07:00:01 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.052 07:00:01 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:43.052 07:00:01 -- setup/devices.sh@47 -- # setup output config 00:04:43.052 07:00:01 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:43.052 07:00:01 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:46.343 07:00:04 -- setup/devices.sh@63 -- # found=1 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.343 07:00:04 -- 
setup/devices.sh@66 -- # (( found == 1 )) 00:04:46.343 07:00:04 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:46.343 07:00:04 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.343 07:00:04 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:46.343 07:00:04 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.343 07:00:04 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:46.343 07:00:04 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.343 07:00:04 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.343 07:00:04 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:46.343 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:46.343 07:00:04 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:46.343 07:00:04 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:46.602 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:46.602 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:46.602 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:46.602 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:46.602 07:00:04 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:46.602 07:00:04 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:46.602 07:00:04 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.602 07:00:04 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:46.602 07:00:04 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:46.602 07:00:04 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.862 07:00:04 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.862 07:00:04 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:46.862 07:00:04 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:46.862 07:00:04 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.862 07:00:04 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.862 07:00:04 -- setup/devices.sh@53 -- # local found=0 00:04:46.862 07:00:04 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:46.862 07:00:04 -- setup/devices.sh@56 -- # : 00:04:46.862 07:00:04 -- setup/devices.sh@59 -- # local pci status 00:04:46.862 07:00:04 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.862 07:00:04 -- setup/devices.sh@47 -- # 
PCI_ALLOWED=0000:d8:00.0 00:04:46.862 07:00:04 -- setup/devices.sh@47 -- # setup output config 00:04:46.862 07:00:04 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:46.862 07:00:04 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:50.153 07:00:08 -- setup/devices.sh@63 -- # found=1 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:50.153 07:00:08 -- setup/devices.sh@68 -- # [[ -n 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:50.153 07:00:08 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.153 07:00:08 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:50.153 07:00:08 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:50.153 07:00:08 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.153 07:00:08 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:50.153 07:00:08 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:50.153 07:00:08 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:50.153 07:00:08 -- setup/devices.sh@50 -- # local mount_point= 00:04:50.153 07:00:08 -- setup/devices.sh@51 -- # local test_file= 00:04:50.153 07:00:08 -- setup/devices.sh@53 -- # local found=0 00:04:50.153 07:00:08 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:50.153 07:00:08 -- setup/devices.sh@59 -- # local pci status 00:04:50.153 07:00:08 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.153 07:00:08 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:50.153 07:00:08 -- setup/devices.sh@47 -- # setup output config 00:04:50.153 07:00:08 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:50.153 07:00:08 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:53.442 07:00:11 -- setup/devices.sh@63 -- # found=1 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 
07:00:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.442 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.442 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.701 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.701 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.701 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.701 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.701 07:00:11 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.701 07:00:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.701 07:00:11 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:53.701 07:00:11 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:53.701 07:00:11 -- setup/devices.sh@68 -- # return 0 00:04:53.701 07:00:11 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:53.701 07:00:11 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.701 07:00:11 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:53.701 07:00:11 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:53.701 07:00:11 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:53.701 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:53.701 00:04:53.701 real 0m13.010s 00:04:53.701 user 0m3.865s 00:04:53.701 sys 0m7.143s 00:04:53.701 07:00:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:53.701 07:00:11 -- common/autotest_common.sh@10 -- # set +x 00:04:53.701 ************************************ 00:04:53.701 END TEST nvme_mount 00:04:53.701 ************************************ 00:04:53.701 07:00:11 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:53.701 07:00:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:53.701 07:00:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:53.701 07:00:11 -- common/autotest_common.sh@10 -- # set +x 00:04:53.701 ************************************ 00:04:53.701 START TEST dm_mount 00:04:53.701 ************************************ 00:04:53.701 07:00:11 -- common/autotest_common.sh@1114 -- # dm_mount 00:04:53.701 07:00:11 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:53.701 07:00:11 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:53.701 07:00:11 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:53.701 07:00:11 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:53.701 07:00:11 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:53.701 07:00:11 -- setup/common.sh@40 -- # local part_no=2 00:04:53.701 07:00:11 -- setup/common.sh@41 -- # local size=1073741824 00:04:53.701 07:00:11 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:53.701 07:00:11 -- setup/common.sh@44 -- # parts=() 00:04:53.701 07:00:11 -- setup/common.sh@44 -- # local parts 00:04:53.701 07:00:11 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:53.701 07:00:11 -- setup/common.sh@46 -- # (( part <= part_no )) 
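The long runs of [[ 0000:xx:04.x == \0\0\0\0\:\d\8\:\0\0\.\0 ]] tests traced above are the devices.sh verify helper walking the setup.sh config report line by line, looking for the one BDF kept in PCI_ALLOWED; xtrace prints the right-hand side with every character backslash-escaped because the pattern word is quoted in the script, so it matches literally. A reduced sketch of that scan, assuming the helper amounts to a read loop over the setup script's output (variable names abridged, not the verbatim SPDK source):

  #!/usr/bin/env bash
  # Scan "setup.sh config" output for the allowed BDF and confirm it is
  # reported as an active device (mounted/held) rather than free to rebind.
  rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  allowed=0000:d8:00.0
  found=0
  while read -r pci _ _ status; do
      [[ $pci == "$allowed" ]] || continue
      [[ $status == "Active devices: "*"data@nvme0n1"* ]] && found=1
  done < <(PCI_ALLOWED="$allowed" "$rootdir/scripts/setup.sh" config)
  (( found == 1 ))   # verify succeeds only if the device was seen as active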
00:04:53.701 07:00:11 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:53.701 07:00:11 -- setup/common.sh@46 -- # (( part++ )) 00:04:53.701 07:00:11 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.701 07:00:11 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:53.701 07:00:11 -- setup/common.sh@46 -- # (( part++ )) 00:04:53.701 07:00:11 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.701 07:00:11 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:53.701 07:00:11 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:53.701 07:00:11 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:55.079 Creating new GPT entries in memory. 00:04:55.079 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:55.079 other utilities. 00:04:55.079 07:00:12 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:55.079 07:00:12 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:55.079 07:00:12 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:55.079 07:00:12 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:55.079 07:00:12 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:56.017 Creating new GPT entries in memory. 00:04:56.017 The operation has completed successfully. 00:04:56.017 07:00:13 -- setup/common.sh@57 -- # (( part++ )) 00:04:56.017 07:00:13 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:56.017 07:00:13 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:56.017 07:00:13 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:56.017 07:00:13 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:56.958 The operation has completed successfully. 
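Both sgdisk calls above follow the geometry partition_drive computes in the trace: the 1 GiB request is divided down to 512-byte sectors, the first partition is anchored at sector 2048, each later partition starts one sector past the previous end, and flock serializes the writes while sync_dev_uevents.sh waits for the matching udev events in the background. A condensed sketch of just that arithmetic (uevent plumbing omitted, not the verbatim helper):

  # Carve part_no 1 GiB partitions out of $disk, the first at sector 2048.
  disk=/dev/nvme0n1
  part_no=2
  size=$(( 1073741824 / 512 ))   # 1 GiB in 512 B sectors = 2097152
  sgdisk "$disk" --zap-all
  part_start=0 part_end=0
  for (( part = 1; part <= part_no; part++ )); do
      (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
      (( part_end = part_start + size - 1 ))
      flock "$disk" sgdisk "$disk" --new=$part:$part_start:$part_end
  done
  # Yields 1:2048:2099199 and 2:2099200:4196351, matching the log above.

The 2048-sector start keeps the first partition 1 MiB aligned, which is also sgdisk's own default placement on a blank disk.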
00:04:56.958 07:00:14 -- setup/common.sh@57 -- # (( part++ )) 00:04:56.958 07:00:14 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:56.958 07:00:14 -- setup/common.sh@62 -- # wait 454166 00:04:56.958 07:00:15 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:56.958 07:00:15 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.958 07:00:15 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.958 07:00:15 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:56.958 07:00:15 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:56.958 07:00:15 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.958 07:00:15 -- setup/devices.sh@161 -- # break 00:04:56.958 07:00:15 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.958 07:00:15 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:56.958 07:00:15 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:56.958 07:00:15 -- setup/devices.sh@166 -- # dm=dm-0 00:04:56.959 07:00:15 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:56.959 07:00:15 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:56.959 07:00:15 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.959 07:00:15 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:56.959 07:00:15 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.959 07:00:15 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.959 07:00:15 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:56.959 07:00:15 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.959 07:00:15 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.959 07:00:15 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:56.959 07:00:15 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:56.959 07:00:15 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.959 07:00:15 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.959 07:00:15 -- setup/devices.sh@53 -- # local found=0 00:04:56.959 07:00:15 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:56.959 07:00:15 -- setup/devices.sh@56 -- # : 00:04:56.959 07:00:15 -- setup/devices.sh@59 -- # local pci status 00:04:56.959 07:00:15 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.959 07:00:15 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:56.959 07:00:15 -- setup/devices.sh@47 -- # setup output config 00:04:56.959 07:00:15 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:56.959 07:00:15 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:00.250 07:00:18 -- 
setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:00.250 07:00:18 -- setup/devices.sh@63 -- # found=1 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.250 07:00:18 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:00.250 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.510 07:00:18 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:00.510 07:00:18 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:00.510 07:00:18 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:00.510 07:00:18 -- setup/devices.sh@73 -- # 
[[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:00.510 07:00:18 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:00.510 07:00:18 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:00.510 07:00:18 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:00.510 07:00:18 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:00.510 07:00:18 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:00.510 07:00:18 -- setup/devices.sh@50 -- # local mount_point= 00:05:00.510 07:00:18 -- setup/devices.sh@51 -- # local test_file= 00:05:00.510 07:00:18 -- setup/devices.sh@53 -- # local found=0 00:05:00.510 07:00:18 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:00.510 07:00:18 -- setup/devices.sh@59 -- # local pci status 00:05:00.510 07:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.510 07:00:18 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:00.510 07:00:18 -- setup/devices.sh@47 -- # setup output config 00:05:00.510 07:00:18 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:00.510 07:00:18 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:03.798 07:00:21 -- setup/devices.sh@63 -- # found=1 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.798 07:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.798 07:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:04.056 07:00:22 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:04.056 07:00:22 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:04.056 07:00:22 -- setup/devices.sh@68 -- # return 0 00:05:04.056 07:00:22 -- setup/devices.sh@187 -- # cleanup_dm 00:05:04.056 07:00:22 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:04.057 07:00:22 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:04.057 07:00:22 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:04.057 07:00:22 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:04.057 07:00:22 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:04.057 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:04.057 07:00:22 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:04.057 07:00:22 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:04.057 00:05:04.057 real 0m10.284s 00:05:04.057 user 0m2.575s 00:05:04.057 sys 0m4.814s 00:05:04.057 07:00:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:04.057 07:00:22 -- common/autotest_common.sh@10 -- # set +x 00:05:04.057 ************************************ 00:05:04.057 END TEST dm_mount 00:05:04.057 ************************************ 00:05:04.057 07:00:22 -- setup/devices.sh@1 -- # cleanup 00:05:04.057 07:00:22 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:04.057 07:00:22 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:04.057 07:00:22 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:04.057 07:00:22 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:04.057 07:00:22 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:04.057 07:00:22 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:04.315 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:04.315 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:04.315 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:04.315 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:04.315 07:00:22 -- setup/devices.sh@12 -- # cleanup_dm 00:05:04.315 07:00:22 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:04.315 07:00:22 -- setup/devices.sh@36 -- # [[ -L 
/dev/mapper/nvme_dm_test ]] 00:05:04.315 07:00:22 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:04.315 07:00:22 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:04.316 07:00:22 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:04.316 07:00:22 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:04.316 00:05:04.316 real 0m27.871s 00:05:04.316 user 0m8.053s 00:05:04.316 sys 0m14.861s 00:05:04.316 07:00:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:04.316 07:00:22 -- common/autotest_common.sh@10 -- # set +x 00:05:04.316 ************************************ 00:05:04.316 END TEST devices 00:05:04.316 ************************************ 00:05:04.574 00:05:04.574 real 1m37.741s 00:05:04.574 user 0m31.078s 00:05:04.574 sys 0m55.853s 00:05:04.574 07:00:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:04.574 07:00:22 -- common/autotest_common.sh@10 -- # set +x 00:05:04.574 ************************************ 00:05:04.574 END TEST setup.sh 00:05:04.574 ************************************ 00:05:04.574 07:00:22 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:07.865 Hugepages 00:05:07.865 node hugesize free / total 00:05:07.865 node0 1048576kB 0 / 0 00:05:07.865 node0 2048kB 2048 / 2048 00:05:07.865 node1 1048576kB 0 / 0 00:05:07.865 node1 2048kB 0 / 0 00:05:07.865 00:05:07.865 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:07.865 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:07.865 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:07.865 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:07.865 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:07.865 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:07.865 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:07.865 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:07.865 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:07.865 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:07.865 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:07.865 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:07.865 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:07.865 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:07.865 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:07.865 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:07.865 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:07.865 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:07.865 07:00:26 -- spdk/autotest.sh@128 -- # uname -s 00:05:07.865 07:00:26 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:05:07.865 07:00:26 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:05:07.865 07:00:26 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:12.060 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 
0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:12.060 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:13.439 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:13.439 07:00:31 -- common/autotest_common.sh@1527 -- # sleep 1 00:05:14.376 07:00:32 -- common/autotest_common.sh@1528 -- # bdfs=() 00:05:14.376 07:00:32 -- common/autotest_common.sh@1528 -- # local bdfs 00:05:14.376 07:00:32 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:05:14.376 07:00:32 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:05:14.376 07:00:32 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:14.376 07:00:32 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:14.376 07:00:32 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:14.376 07:00:32 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:14.376 07:00:32 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:14.376 07:00:32 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:14.376 07:00:32 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:14.376 07:00:32 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:17.666 Waiting for block devices as requested 00:05:17.925 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:17.925 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:17.925 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:18.185 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:18.185 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:18.185 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:18.444 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:18.444 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:18.444 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:18.704 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:18.704 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:18.704 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:18.963 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:18.963 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:18.963 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:19.222 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:19.222 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:19.481 07:00:37 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:05:19.481 07:00:37 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:19.481 07:00:37 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:05:19.481 07:00:37 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:05:19.481 07:00:37 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:19.481 07:00:37 -- common/autotest_common.sh@1498 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:19.481 07:00:37 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:19.481 07:00:37 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:05:19.481 07:00:37 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:05:19.481 07:00:37 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:05:19.481 07:00:37 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:19.481 
07:00:37 -- common/autotest_common.sh@1540 -- # grep oacs 00:05:19.481 07:00:37 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:19.481 07:00:37 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:05:19.481 07:00:37 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:05:19.481 07:00:37 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:05:19.481 07:00:37 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:05:19.481 07:00:37 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:05:19.481 07:00:37 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:05:19.481 07:00:37 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:05:19.481 07:00:37 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:05:19.481 07:00:37 -- common/autotest_common.sh@1552 -- # continue 00:05:19.481 07:00:37 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:05:19.481 07:00:37 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:19.481 07:00:37 -- common/autotest_common.sh@10 -- # set +x 00:05:19.481 07:00:37 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:05:19.481 07:00:37 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:19.481 07:00:37 -- common/autotest_common.sh@10 -- # set +x 00:05:19.481 07:00:37 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:23.676 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:23.676 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:25.055 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:25.055 07:00:42 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:05:25.055 07:00:42 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:25.055 07:00:42 -- common/autotest_common.sh@10 -- # set +x 00:05:25.056 07:00:43 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:05:25.056 07:00:43 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:05:25.056 07:00:43 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:05:25.056 07:00:43 -- common/autotest_common.sh@1572 -- # bdfs=() 00:05:25.056 07:00:43 -- common/autotest_common.sh@1572 -- # local bdfs 00:05:25.056 07:00:43 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs 00:05:25.056 07:00:43 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:25.056 07:00:43 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:25.056 07:00:43 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:25.056 07:00:43 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:25.056 07:00:43 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:25.056 07:00:43 -- 
common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:25.056 07:00:43 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:25.056 07:00:43 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:05:25.056 07:00:43 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:25.056 07:00:43 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:05:25.056 07:00:43 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:25.056 07:00:43 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:05:25.056 07:00:43 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:05:25.056 07:00:43 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:05:25.056 07:00:43 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=464167 00:05:25.056 07:00:43 -- common/autotest_common.sh@1593 -- # waitforlisten 464167 00:05:25.056 07:00:43 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:25.056 07:00:43 -- common/autotest_common.sh@829 -- # '[' -z 464167 ']' 00:05:25.056 07:00:43 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:25.056 07:00:43 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:25.056 07:00:43 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:25.056 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:25.056 07:00:43 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:25.056 07:00:43 -- common/autotest_common.sh@10 -- # set +x 00:05:25.056 [2024-12-13 07:00:43.180693] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
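The waitforlisten step traced above blocks until spdk_tgt (pid 464167 here) answers on its JSON-RPC socket. A minimal bash sketch of that pattern, assuming the workspace layout shown in this log; the retry budget and poll interval are illustrative stand-ins for the real helper in common/autotest_common.sh:

  spdk_tgt=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
  rpc_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
  # Launch the target, then poll the UNIX-domain RPC socket until it responds.
  "$spdk_tgt" & tgt_pid=$!
  for _ in $(seq 1 100); do
      "$rpc_py" -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1 && break
      kill -0 "$tgt_pid" 2>/dev/null || { echo "spdk_tgt exited early" >&2; exit 1; }
      sleep 0.5
  done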
00:05:25.056 [2024-12-13 07:00:43.180771] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid464167 ] 00:05:25.056 EAL: No free 2048 kB hugepages reported on node 1 00:05:25.056 [2024-12-13 07:00:43.264114] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:25.315 [2024-12-13 07:00:43.303158] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:25.315 [2024-12-13 07:00:43.303286] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:25.882 07:00:44 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:25.882 07:00:44 -- common/autotest_common.sh@862 -- # return 0 00:05:25.882 07:00:44 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:05:25.882 07:00:44 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:05:25.882 07:00:44 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:29.169 nvme0n1 00:05:29.169 07:00:47 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:29.169 [2024-12-13 07:00:47.178362] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:29.169 request: 00:05:29.169 { 00:05:29.169 "nvme_ctrlr_name": "nvme0", 00:05:29.169 "password": "test", 00:05:29.169 "method": "bdev_nvme_opal_revert", 00:05:29.169 "req_id": 1 00:05:29.169 } 00:05:29.169 Got JSON-RPC error response 00:05:29.169 response: 00:05:29.169 { 00:05:29.169 "code": -32602, 00:05:29.169 "message": "Invalid parameters" 00:05:29.169 } 00:05:29.169 07:00:47 -- common/autotest_common.sh@1599 -- # true 00:05:29.169 07:00:47 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:05:29.169 07:00:47 -- common/autotest_common.sh@1603 -- # killprocess 464167 00:05:29.169 07:00:47 -- common/autotest_common.sh@936 -- # '[' -z 464167 ']' 00:05:29.169 07:00:47 -- common/autotest_common.sh@940 -- # kill -0 464167 00:05:29.169 07:00:47 -- common/autotest_common.sh@941 -- # uname 00:05:29.169 07:00:47 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:29.169 07:00:47 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 464167 00:05:29.169 07:00:47 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:29.169 07:00:47 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:29.169 07:00:47 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 464167' 00:05:29.169 killing process with pid 464167 00:05:29.169 07:00:47 -- common/autotest_common.sh@955 -- # kill 464167 00:05:29.169 07:00:47 -- common/autotest_common.sh@960 -- # wait 464167 00:05:29.169 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:29.169 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:29.169 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:29.169 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:29.169 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:29.169 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:29.169 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:29.169 EAL: Unexpected size 0 of DMA remapping cleared instead of 
2097152 00:05:29.170 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 
00:05:31.705 07:00:49 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:05:31.705 07:00:49 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:05:31.705 07:00:49 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:31.705 07:00:49 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:31.705 07:00:49 -- spdk/autotest.sh@160 -- # timing_enter lib 00:05:31.705 07:00:49 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:31.705 07:00:49 -- common/autotest_common.sh@10 -- # set +x 00:05:31.705 07:00:49 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:31.705 07:00:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:31.705 07:00:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:31.705 07:00:49 -- common/autotest_common.sh@10 -- # set +x 00:05:31.705 ************************************ 00:05:31.705 START TEST env 00:05:31.705 ************************************ 00:05:31.705 07:00:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:31.705 * Looking for test storage... 00:05:31.705 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 
00:05:31.705 07:00:49 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:31.705 07:00:49 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:31.705 07:00:49 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:31.705 07:00:49 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:31.705 07:00:49 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:31.705 07:00:49 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:31.705 07:00:49 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:31.705 07:00:49 -- scripts/common.sh@335 -- # IFS=.-: 00:05:31.705 07:00:49 -- scripts/common.sh@335 -- # read -ra ver1 00:05:31.705 07:00:49 -- scripts/common.sh@336 -- # IFS=.-: 00:05:31.705 07:00:49 -- scripts/common.sh@336 -- # read -ra ver2 00:05:31.705 07:00:49 -- scripts/common.sh@337 -- # local 'op=<' 00:05:31.705 07:00:49 -- scripts/common.sh@339 -- # ver1_l=2 00:05:31.705 07:00:49 -- scripts/common.sh@340 -- # ver2_l=1 00:05:31.705 07:00:49 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:31.705 07:00:49 -- scripts/common.sh@343 -- # case "$op" in 00:05:31.705 07:00:49 -- scripts/common.sh@344 -- # : 1 00:05:31.705 07:00:49 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:31.705 07:00:49 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:31.705 07:00:49 -- scripts/common.sh@364 -- # decimal 1 00:05:31.705 07:00:49 -- scripts/common.sh@352 -- # local d=1 00:05:31.705 07:00:49 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:31.705 07:00:49 -- scripts/common.sh@354 -- # echo 1 00:05:31.705 07:00:49 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:31.705 07:00:49 -- scripts/common.sh@365 -- # decimal 2 00:05:31.705 07:00:49 -- scripts/common.sh@352 -- # local d=2 00:05:31.705 07:00:49 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:31.705 07:00:49 -- scripts/common.sh@354 -- # echo 2 00:05:31.705 07:00:49 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:31.705 07:00:49 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:31.705 07:00:49 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:31.705 07:00:49 -- scripts/common.sh@367 -- # return 0 00:05:31.705 07:00:49 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:31.705 07:00:49 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:31.705 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.705 --rc genhtml_branch_coverage=1 00:05:31.705 --rc genhtml_function_coverage=1 00:05:31.705 --rc genhtml_legend=1 00:05:31.705 --rc geninfo_all_blocks=1 00:05:31.705 --rc geninfo_unexecuted_blocks=1 00:05:31.705 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.705 ' 00:05:31.705 07:00:49 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:31.705 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.705 --rc genhtml_branch_coverage=1 00:05:31.705 --rc genhtml_function_coverage=1 00:05:31.705 --rc genhtml_legend=1 00:05:31.705 --rc geninfo_all_blocks=1 00:05:31.705 --rc geninfo_unexecuted_blocks=1 00:05:31.705 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.705 ' 00:05:31.705 07:00:49 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:31.705 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.705 --rc genhtml_branch_coverage=1 00:05:31.705 --rc genhtml_function_coverage=1 00:05:31.705 --rc genhtml_legend=1 00:05:31.705 --rc geninfo_all_blocks=1 00:05:31.705 --rc geninfo_unexecuted_blocks=1 00:05:31.705 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.705 ' 00:05:31.705 07:00:49 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:31.705 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.705 --rc genhtml_branch_coverage=1 00:05:31.705 --rc genhtml_function_coverage=1 00:05:31.705 --rc genhtml_legend=1 00:05:31.705 --rc geninfo_all_blocks=1 00:05:31.705 --rc geninfo_unexecuted_blocks=1 00:05:31.705 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.705 ' 00:05:31.705 07:00:49 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:31.705 07:00:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:31.705 07:00:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:31.705 07:00:49 -- common/autotest_common.sh@10 -- # set +x 00:05:31.705 ************************************ 00:05:31.705 START TEST env_memory 00:05:31.705 ************************************ 00:05:31.705 07:00:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
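The cmp_versions trace above (gating on lcov 1.15 < 2) is a field-by-field compare of dotted version strings. A hedged, condensed rendition of the same idea; the real scripts/common.sh helper also handles '>', '=' and the '.-:' separator set:

  version_lt() {
      local IFS=.
      local -a a=($1) b=($2)
      for ((i = 0; i < ${#a[@]} || i < ${#b[@]}; i++)); do
          local x=${a[i]:-0} y=${b[i]:-0}
          (( x < y )) && return 0   # first differing field decides: strictly less
          (( x > y )) && return 1
      done
      return 1                      # equal is not less-than
  }
  version_lt 1.15 2 && echo "1.15 < 2"   # prints, matching the 'return 0' traced above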
00:05:31.705 00:05:31.705 00:05:31.705 CUnit - A unit testing framework for C - Version 2.1-3 00:05:31.705 http://cunit.sourceforge.net/ 00:05:31.705 00:05:31.705 00:05:31.705 Suite: memory 00:05:31.705 Test: alloc and free memory map ...[2024-12-13 07:00:49.685067] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:31.705 passed 00:05:31.705 Test: mem map translation ...[2024-12-13 07:00:49.697914] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:31.705 [2024-12-13 07:00:49.697930] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:31.705 [2024-12-13 07:00:49.697959] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:31.705 [2024-12-13 07:00:49.697966] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:31.705 passed 00:05:31.705 Test: mem map registration ...[2024-12-13 07:00:49.717529] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:31.705 [2024-12-13 07:00:49.717544] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:31.705 passed 00:05:31.705 Test: mem map adjacent registrations ...passed 00:05:31.705 00:05:31.705 Run Summary: Type Total Ran Passed Failed Inactive 00:05:31.705 suites 1 1 n/a 0 0 00:05:31.705 tests 4 4 4 0 0 00:05:31.705 asserts 152 152 152 0 n/a 00:05:31.705 00:05:31.705 Elapsed time = 0.081 seconds 00:05:31.705 00:05:31.705 real 0m0.094s 00:05:31.705 user 0m0.084s 00:05:31.705 sys 0m0.010s 00:05:31.705 07:00:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:31.705 07:00:49 -- common/autotest_common.sh@10 -- # set +x 00:05:31.705 ************************************ 00:05:31.705 END TEST env_memory 00:05:31.705 ************************************ 00:05:31.705 07:00:49 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:31.705 07:00:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:31.705 07:00:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:31.705 07:00:49 -- common/autotest_common.sh@10 -- # set +x 00:05:31.705 ************************************ 00:05:31.705 START TEST env_vtophys 00:05:31.705 ************************************ 00:05:31.705 07:00:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:31.705 EAL: lib.eal log level changed from notice to debug 00:05:31.705 EAL: Detected lcore 0 as core 0 on socket 0 00:05:31.705 EAL: Detected lcore 1 as core 1 on socket 0 00:05:31.705 EAL: Detected lcore 2 as core 2 on socket 0 00:05:31.705 EAL: Detected lcore 3 as core 3 on socket 0 00:05:31.705 EAL: Detected lcore 4 as core 4 on socket 0 00:05:31.705 EAL: Detected lcore 5 as core 5 on socket 0 00:05:31.705 EAL: Detected lcore 6 as 
core 6 on socket 0 00:05:31.705 EAL: Detected lcore 7 as core 8 on socket 0 00:05:31.705 EAL: Detected lcore 8 as core 9 on socket 0 00:05:31.705 EAL: Detected lcore 9 as core 10 on socket 0 00:05:31.705 EAL: Detected lcore 10 as core 11 on socket 0 00:05:31.705 EAL: Detected lcore 11 as core 12 on socket 0 00:05:31.706 EAL: Detected lcore 12 as core 13 on socket 0 00:05:31.706 EAL: Detected lcore 13 as core 14 on socket 0 00:05:31.706 EAL: Detected lcore 14 as core 16 on socket 0 00:05:31.706 EAL: Detected lcore 15 as core 17 on socket 0 00:05:31.706 EAL: Detected lcore 16 as core 18 on socket 0 00:05:31.706 EAL: Detected lcore 17 as core 19 on socket 0 00:05:31.706 EAL: Detected lcore 18 as core 20 on socket 0 00:05:31.706 EAL: Detected lcore 19 as core 21 on socket 0 00:05:31.706 EAL: Detected lcore 20 as core 22 on socket 0 00:05:31.706 EAL: Detected lcore 21 as core 24 on socket 0 00:05:31.706 EAL: Detected lcore 22 as core 25 on socket 0 00:05:31.706 EAL: Detected lcore 23 as core 26 on socket 0 00:05:31.706 EAL: Detected lcore 24 as core 27 on socket 0 00:05:31.706 EAL: Detected lcore 25 as core 28 on socket 0 00:05:31.706 EAL: Detected lcore 26 as core 29 on socket 0 00:05:31.706 EAL: Detected lcore 27 as core 30 on socket 0 00:05:31.706 EAL: Detected lcore 28 as core 0 on socket 1 00:05:31.706 EAL: Detected lcore 29 as core 1 on socket 1 00:05:31.706 EAL: Detected lcore 30 as core 2 on socket 1 00:05:31.706 EAL: Detected lcore 31 as core 3 on socket 1 00:05:31.706 EAL: Detected lcore 32 as core 4 on socket 1 00:05:31.706 EAL: Detected lcore 33 as core 5 on socket 1 00:05:31.706 EAL: Detected lcore 34 as core 6 on socket 1 00:05:31.706 EAL: Detected lcore 35 as core 8 on socket 1 00:05:31.706 EAL: Detected lcore 36 as core 9 on socket 1 00:05:31.706 EAL: Detected lcore 37 as core 10 on socket 1 00:05:31.706 EAL: Detected lcore 38 as core 11 on socket 1 00:05:31.706 EAL: Detected lcore 39 as core 12 on socket 1 00:05:31.706 EAL: Detected lcore 40 as core 13 on socket 1 00:05:31.706 EAL: Detected lcore 41 as core 14 on socket 1 00:05:31.706 EAL: Detected lcore 42 as core 16 on socket 1 00:05:31.706 EAL: Detected lcore 43 as core 17 on socket 1 00:05:31.706 EAL: Detected lcore 44 as core 18 on socket 1 00:05:31.706 EAL: Detected lcore 45 as core 19 on socket 1 00:05:31.706 EAL: Detected lcore 46 as core 20 on socket 1 00:05:31.706 EAL: Detected lcore 47 as core 21 on socket 1 00:05:31.706 EAL: Detected lcore 48 as core 22 on socket 1 00:05:31.706 EAL: Detected lcore 49 as core 24 on socket 1 00:05:31.706 EAL: Detected lcore 50 as core 25 on socket 1 00:05:31.706 EAL: Detected lcore 51 as core 26 on socket 1 00:05:31.706 EAL: Detected lcore 52 as core 27 on socket 1 00:05:31.706 EAL: Detected lcore 53 as core 28 on socket 1 00:05:31.706 EAL: Detected lcore 54 as core 29 on socket 1 00:05:31.706 EAL: Detected lcore 55 as core 30 on socket 1 00:05:31.706 EAL: Detected lcore 56 as core 0 on socket 0 00:05:31.706 EAL: Detected lcore 57 as core 1 on socket 0 00:05:31.706 EAL: Detected lcore 58 as core 2 on socket 0 00:05:31.706 EAL: Detected lcore 59 as core 3 on socket 0 00:05:31.706 EAL: Detected lcore 60 as core 4 on socket 0 00:05:31.706 EAL: Detected lcore 61 as core 5 on socket 0 00:05:31.706 EAL: Detected lcore 62 as core 6 on socket 0 00:05:31.706 EAL: Detected lcore 63 as core 8 on socket 0 00:05:31.706 EAL: Detected lcore 64 as core 9 on socket 0 00:05:31.706 EAL: Detected lcore 65 as core 10 on socket 0 00:05:31.706 EAL: Detected lcore 66 as core 11 on socket 0 00:05:31.706 EAL: 
Detected lcore 67 as core 12 on socket 0 00:05:31.706 EAL: Detected lcore 68 as core 13 on socket 0 00:05:31.706 EAL: Detected lcore 69 as core 14 on socket 0 00:05:31.706 EAL: Detected lcore 70 as core 16 on socket 0 00:05:31.706 EAL: Detected lcore 71 as core 17 on socket 0 00:05:31.706 EAL: Detected lcore 72 as core 18 on socket 0 00:05:31.706 EAL: Detected lcore 73 as core 19 on socket 0 00:05:31.706 EAL: Detected lcore 74 as core 20 on socket 0 00:05:31.706 EAL: Detected lcore 75 as core 21 on socket 0 00:05:31.706 EAL: Detected lcore 76 as core 22 on socket 0 00:05:31.706 EAL: Detected lcore 77 as core 24 on socket 0 00:05:31.706 EAL: Detected lcore 78 as core 25 on socket 0 00:05:31.706 EAL: Detected lcore 79 as core 26 on socket 0 00:05:31.706 EAL: Detected lcore 80 as core 27 on socket 0 00:05:31.706 EAL: Detected lcore 81 as core 28 on socket 0 00:05:31.706 EAL: Detected lcore 82 as core 29 on socket 0 00:05:31.706 EAL: Detected lcore 83 as core 30 on socket 0 00:05:31.706 EAL: Detected lcore 84 as core 0 on socket 1 00:05:31.706 EAL: Detected lcore 85 as core 1 on socket 1 00:05:31.706 EAL: Detected lcore 86 as core 2 on socket 1 00:05:31.706 EAL: Detected lcore 87 as core 3 on socket 1 00:05:31.706 EAL: Detected lcore 88 as core 4 on socket 1 00:05:31.706 EAL: Detected lcore 89 as core 5 on socket 1 00:05:31.706 EAL: Detected lcore 90 as core 6 on socket 1 00:05:31.706 EAL: Detected lcore 91 as core 8 on socket 1 00:05:31.706 EAL: Detected lcore 92 as core 9 on socket 1 00:05:31.706 EAL: Detected lcore 93 as core 10 on socket 1 00:05:31.706 EAL: Detected lcore 94 as core 11 on socket 1 00:05:31.706 EAL: Detected lcore 95 as core 12 on socket 1 00:05:31.706 EAL: Detected lcore 96 as core 13 on socket 1 00:05:31.706 EAL: Detected lcore 97 as core 14 on socket 1 00:05:31.706 EAL: Detected lcore 98 as core 16 on socket 1 00:05:31.706 EAL: Detected lcore 99 as core 17 on socket 1 00:05:31.706 EAL: Detected lcore 100 as core 18 on socket 1 00:05:31.706 EAL: Detected lcore 101 as core 19 on socket 1 00:05:31.706 EAL: Detected lcore 102 as core 20 on socket 1 00:05:31.706 EAL: Detected lcore 103 as core 21 on socket 1 00:05:31.706 EAL: Detected lcore 104 as core 22 on socket 1 00:05:31.706 EAL: Detected lcore 105 as core 24 on socket 1 00:05:31.706 EAL: Detected lcore 106 as core 25 on socket 1 00:05:31.706 EAL: Detected lcore 107 as core 26 on socket 1 00:05:31.706 EAL: Detected lcore 108 as core 27 on socket 1 00:05:31.706 EAL: Detected lcore 109 as core 28 on socket 1 00:05:31.706 EAL: Detected lcore 110 as core 29 on socket 1 00:05:31.706 EAL: Detected lcore 111 as core 30 on socket 1 00:05:31.706 EAL: Maximum logical cores by configuration: 128 00:05:31.706 EAL: Detected CPU lcores: 112 00:05:31.706 EAL: Detected NUMA nodes: 2 00:05:31.706 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:05:31.706 EAL: Checking presence of .so 'librte_eal.so.23' 00:05:31.706 EAL: Checking presence of .so 'librte_eal.so' 00:05:31.706 EAL: Detected static linkage of DPDK 00:05:31.706 EAL: No shared files mode enabled, IPC will be disabled 00:05:31.706 EAL: Bus pci wants IOVA as 'DC' 00:05:31.706 EAL: Buses did not request a specific IOVA mode. 00:05:31.706 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:31.706 EAL: Selected IOVA mode 'VA' 00:05:31.706 EAL: No free 2048 kB hugepages reported on node 1 00:05:31.706 EAL: Probing VFIO support... 
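The lcore table above is EAL's CPU topology scan: 112 logical cores spread over 2 NUMA sockets. The same lcore-to-core/socket mapping can be cross-checked straight from Linux sysfs, a hedged sketch with no DPDK involved:

  for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
      lcore=${cpu##*cpu}                                  # cpuN -> N
      core=$(cat "$cpu/topology/core_id")
      socket=$(cat "$cpu/topology/physical_package_id")
      echo "lcore $lcore -> core $core on socket $socket"
  done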
00:05:31.706 EAL: IOMMU type 1 (Type 1) is supported 00:05:31.706 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:31.706 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:31.706 EAL: VFIO support initialized 00:05:31.706 EAL: Ask a virtual area of 0x2e000 bytes 00:05:31.706 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:31.706 EAL: Setting up physically contiguous memory... 00:05:31.706 EAL: Setting maximum number of open files to 524288 00:05:31.706 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:31.706 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:31.706 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:31.706 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.706 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:31.706 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:31.706 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.706 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:31.706 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:31.706 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.706 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:31.706 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:31.706 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.706 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:31.706 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:31.706 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.706 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:31.706 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:31.706 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.706 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:31.706 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:31.706 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.706 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:31.706 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:31.706 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.706 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:31.706 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:31.706 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:31.706 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.706 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:31.706 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:31.706 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.706 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:31.706 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:31.706 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.706 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:31.706 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:31.706 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.706 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:31.706 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:31.706 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.706 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:31.706 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:31.706 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.706 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:31.706 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:31.706 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.706 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:31.706 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:31.706 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.706 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:31.706 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:31.706 EAL: Hugepages will be freed exactly as allocated. 00:05:31.706 EAL: No shared files mode enabled, IPC is disabled 00:05:31.706 EAL: No shared files mode enabled, IPC is disabled 00:05:31.706 EAL: TSC frequency is ~2500000 KHz 00:05:31.706 EAL: Main lcore 0 is ready (tid=7f9512013a00;cpuset=[0]) 00:05:31.706 EAL: Trying to obtain current memory policy. 00:05:31.706 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.706 EAL: Restoring previous memory policy: 0 00:05:31.706 EAL: request: mp_malloc_sync 00:05:31.706 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was expanded by 2MB 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Mem event callback 'spdk:(nil)' registered 00:05:31.707 00:05:31.707 00:05:31.707 CUnit - A unit testing framework for C - Version 2.1-3 00:05:31.707 http://cunit.sourceforge.net/ 00:05:31.707 00:05:31.707 00:05:31.707 Suite: components_suite 00:05:31.707 Test: vtophys_malloc_test ...passed 00:05:31.707 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:31.707 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.707 EAL: Restoring previous memory policy: 4 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was expanded by 4MB 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was shrunk by 4MB 00:05:31.707 EAL: Trying to obtain current memory policy. 00:05:31.707 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.707 EAL: Restoring previous memory policy: 4 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was expanded by 6MB 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was shrunk by 6MB 00:05:31.707 EAL: Trying to obtain current memory policy. 00:05:31.707 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.707 EAL: Restoring previous memory policy: 4 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was expanded by 10MB 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was shrunk by 10MB 00:05:31.707 EAL: Trying to obtain current memory policy. 
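The reservation sizes above follow from the memseg geometry EAL printed: each of the 4 lists per socket covers n_segs:8192 hugepages of hugepage_sz:2097152 (2 MiB), so 8192 * 2 MiB = 16 GiB = 0x400000000 bytes of virtual address space per list, with the small 0x61000-byte area requested alongside each list holding the memseg list itself. A one-line sanity check:

  printf '0x%x\n' $(( 8192 * 2 * 1024 * 1024 ))   # -> 0x400000000, matching every "size = 0x400000000" line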
00:05:31.707 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.707 EAL: Restoring previous memory policy: 4 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was expanded by 18MB 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was shrunk by 18MB 00:05:31.707 EAL: Trying to obtain current memory policy. 00:05:31.707 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.707 EAL: Restoring previous memory policy: 4 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was expanded by 34MB 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was shrunk by 34MB 00:05:31.707 EAL: Trying to obtain current memory policy. 00:05:31.707 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.707 EAL: Restoring previous memory policy: 4 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was expanded by 66MB 00:05:31.707 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.707 EAL: request: mp_malloc_sync 00:05:31.707 EAL: No shared files mode enabled, IPC is disabled 00:05:31.707 EAL: Heap on socket 0 was shrunk by 66MB 00:05:31.707 EAL: Trying to obtain current memory policy. 00:05:31.707 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.966 EAL: Restoring previous memory policy: 4 00:05:31.966 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.966 EAL: request: mp_malloc_sync 00:05:31.966 EAL: No shared files mode enabled, IPC is disabled 00:05:31.966 EAL: Heap on socket 0 was expanded by 130MB 00:05:31.966 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.966 EAL: request: mp_malloc_sync 00:05:31.966 EAL: No shared files mode enabled, IPC is disabled 00:05:31.966 EAL: Heap on socket 0 was shrunk by 130MB 00:05:31.966 EAL: Trying to obtain current memory policy. 00:05:31.966 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.966 EAL: Restoring previous memory policy: 4 00:05:31.966 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.966 EAL: request: mp_malloc_sync 00:05:31.966 EAL: No shared files mode enabled, IPC is disabled 00:05:31.966 EAL: Heap on socket 0 was expanded by 258MB 00:05:31.966 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.966 EAL: request: mp_malloc_sync 00:05:31.966 EAL: No shared files mode enabled, IPC is disabled 00:05:31.966 EAL: Heap on socket 0 was shrunk by 258MB 00:05:31.966 EAL: Trying to obtain current memory policy. 
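The expand/shrink pairs in vtophys_spdk_malloc_test step through a (2^n + 2) MiB size series; each allocation outgrows the heap and triggers a 'spdk:(nil)' mem event expansion, and the matching free lets EAL hand the hugepages back. A hedged sketch reproducing just the size series logged here:

  for n in $(seq 1 10); do printf '%dMB ' $(( (1 << n) + 2 )); done; echo
  # -> 4MB 6MB 10MB 18MB 34MB 66MB 130MB 258MB 514MB 1026MB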
00:05:31.966 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:32.225 EAL: Restoring previous memory policy: 4 00:05:32.225 EAL: Calling mem event callback 'spdk:(nil)' 00:05:32.225 EAL: request: mp_malloc_sync 00:05:32.225 EAL: No shared files mode enabled, IPC is disabled 00:05:32.225 EAL: Heap on socket 0 was expanded by 514MB 00:05:32.225 EAL: Calling mem event callback 'spdk:(nil)' 00:05:32.225 EAL: request: mp_malloc_sync 00:05:32.225 EAL: No shared files mode enabled, IPC is disabled 00:05:32.225 EAL: Heap on socket 0 was shrunk by 514MB 00:05:32.226 EAL: Trying to obtain current memory policy. 00:05:32.226 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:32.484 EAL: Restoring previous memory policy: 4 00:05:32.484 EAL: Calling mem event callback 'spdk:(nil)' 00:05:32.484 EAL: request: mp_malloc_sync 00:05:32.484 EAL: No shared files mode enabled, IPC is disabled 00:05:32.484 EAL: Heap on socket 0 was expanded by 1026MB 00:05:32.744 EAL: Calling mem event callback 'spdk:(nil)' 00:05:32.744 EAL: request: mp_malloc_sync 00:05:32.744 EAL: No shared files mode enabled, IPC is disabled 00:05:32.744 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:32.744 passed 00:05:32.744 00:05:32.744 Run Summary: Type Total Ran Passed Failed Inactive 00:05:32.744 suites 1 1 n/a 0 0 00:05:32.744 tests 2 2 2 0 0 00:05:32.744 asserts 497 497 497 0 n/a 00:05:32.744 00:05:32.744 Elapsed time = 0.985 seconds 00:05:32.744 EAL: Calling mem event callback 'spdk:(nil)' 00:05:32.744 EAL: request: mp_malloc_sync 00:05:32.744 EAL: No shared files mode enabled, IPC is disabled 00:05:32.744 EAL: Heap on socket 0 was shrunk by 2MB 00:05:32.744 EAL: No shared files mode enabled, IPC is disabled 00:05:32.744 EAL: No shared files mode enabled, IPC is disabled 00:05:32.744 EAL: No shared files mode enabled, IPC is disabled 00:05:32.744 00:05:32.744 real 0m1.121s 00:05:32.744 user 0m0.640s 00:05:32.744 sys 0m0.449s 00:05:32.744 07:00:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.744 07:00:50 -- common/autotest_common.sh@10 -- # set +x 00:05:32.744 ************************************ 00:05:32.744 END TEST env_vtophys 00:05:32.744 ************************************ 00:05:32.744 07:00:50 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:32.744 07:00:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:32.744 07:00:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.744 07:00:50 -- common/autotest_common.sh@10 -- # set +x 00:05:32.744 ************************************ 00:05:32.744 START TEST env_pci 00:05:32.744 ************************************ 00:05:32.744 07:00:50 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:32.744 00:05:32.744 00:05:32.744 CUnit - A unit testing framework for C - Version 2.1-3 00:05:32.744 http://cunit.sourceforge.net/ 00:05:32.744 00:05:32.744 00:05:32.744 Suite: pci 00:05:32.744 Test: pci_hook ...[2024-12-13 07:00:50.982253] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 465653 has claimed it 00:05:33.003 EAL: Cannot find device (10000:00:01.0) 00:05:33.003 EAL: Failed to attach device on primary process 00:05:33.003 passed 00:05:33.003 00:05:33.003 Run Summary: Type Total Ran Passed Failed Inactive 00:05:33.003 suites 1 1 n/a 0 0 00:05:33.003 tests 1 1 1 0 0 
00:05:33.003 asserts 25 25 25 0 n/a 00:05:33.003 00:05:33.003 Elapsed time = 0.038 seconds 00:05:33.003 00:05:33.003 real 0m0.057s 00:05:33.003 user 0m0.008s 00:05:33.003 sys 0m0.049s 00:05:33.003 07:00:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:33.003 07:00:51 -- common/autotest_common.sh@10 -- # set +x 00:05:33.003 ************************************ 00:05:33.003 END TEST env_pci 00:05:33.003 ************************************ 00:05:33.003 07:00:51 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:33.003 07:00:51 -- env/env.sh@15 -- # uname 00:05:33.003 07:00:51 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:33.003 07:00:51 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:33.003 07:00:51 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:33.003 07:00:51 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:33.003 07:00:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:33.003 07:00:51 -- common/autotest_common.sh@10 -- # set +x 00:05:33.003 ************************************ 00:05:33.003 START TEST env_dpdk_post_init 00:05:33.003 ************************************ 00:05:33.003 07:00:51 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:33.003 EAL: Detected CPU lcores: 112 00:05:33.003 EAL: Detected NUMA nodes: 2 00:05:33.003 EAL: Detected static linkage of DPDK 00:05:33.003 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:33.003 EAL: Selected IOVA mode 'VA' 00:05:33.003 EAL: No free 2048 kB hugepages reported on node 1 00:05:33.003 EAL: VFIO support initialized 00:05:33.003 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:33.003 EAL: Using IOMMU type 1 (Type 1) 00:05:33.940 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:38.127 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:38.127 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:38.127 Starting DPDK initialization... 00:05:38.127 Starting SPDK post initialization... 00:05:38.127 SPDK NVMe probe 00:05:38.127 Attaching to 0000:d8:00.0 00:05:38.127 Attached to 0000:d8:00.0 00:05:38.127 Cleaning up... 
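The post-init probe above attaches the controller at 0000:d8:00.0 through SPDK's PCIe path, the same attach the earlier opal-revert step issued over JSON-RPC. For reference, a hedged sketch of driving that cycle by hand against a running spdk_tgt; the RPC names and flags mirror ones already shown in this log:

  rpc_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
  "$rpc_py" bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
  "$rpc_py" bdev_get_bdevs -b nvme0n1      # confirm the namespace bdev appeared
  "$rpc_py" bdev_nvme_detach_controller nvme0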
00:05:38.127 00:05:38.127 real 0m4.765s 00:05:38.127 user 0m3.610s 00:05:38.127 sys 0m0.399s 00:05:38.127 07:00:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:38.127 07:00:55 -- common/autotest_common.sh@10 -- # set +x 00:05:38.127 ************************************ 00:05:38.127 END TEST env_dpdk_post_init 00:05:38.127 ************************************ 00:05:38.127 07:00:55 -- env/env.sh@26 -- # uname 00:05:38.127 07:00:55 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:38.127 07:00:55 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:38.127 07:00:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:38.127 07:00:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:38.127 07:00:55 -- common/autotest_common.sh@10 -- # set +x 00:05:38.127 ************************************ 00:05:38.127 START TEST env_mem_callbacks 00:05:38.127 ************************************ 00:05:38.127 07:00:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:38.127 EAL: Detected CPU lcores: 112 00:05:38.127 EAL: Detected NUMA nodes: 2 00:05:38.127 EAL: Detected static linkage of DPDK 00:05:38.128 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:38.128 EAL: Selected IOVA mode 'VA' 00:05:38.128 EAL: No free 2048 kB hugepages reported on node 1 00:05:38.128 EAL: VFIO support initialized 00:05:38.128 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:38.128 00:05:38.128 00:05:38.128 CUnit - A unit testing framework for C - Version 2.1-3 00:05:38.128 http://cunit.sourceforge.net/ 00:05:38.128 00:05:38.128 00:05:38.128 Suite: memory 00:05:38.128 Test: test ... 
00:05:38.128 register 0x200000200000 2097152 00:05:38.128 malloc 3145728 00:05:38.128 register 0x200000400000 4194304 00:05:38.128 buf 0x200000500000 len 3145728 PASSED 00:05:38.128 malloc 64 00:05:38.128 buf 0x2000004fff40 len 64 PASSED 00:05:38.128 malloc 4194304 00:05:38.128 register 0x200000800000 6291456 00:05:38.128 buf 0x200000a00000 len 4194304 PASSED 00:05:38.128 free 0x200000500000 3145728 00:05:38.128 free 0x2000004fff40 64 00:05:38.128 unregister 0x200000400000 4194304 PASSED 00:05:38.128 free 0x200000a00000 4194304 00:05:38.128 unregister 0x200000800000 6291456 PASSED 00:05:38.128 malloc 8388608 00:05:38.128 register 0x200000400000 10485760 00:05:38.128 buf 0x200000600000 len 8388608 PASSED 00:05:38.128 free 0x200000600000 8388608 00:05:38.128 unregister 0x200000400000 10485760 PASSED 00:05:38.128 passed 00:05:38.128 00:05:38.128 Run Summary: Type Total Ran Passed Failed Inactive 00:05:38.128 suites 1 1 n/a 0 0 00:05:38.128 tests 1 1 1 0 0 00:05:38.128 asserts 15 15 15 0 n/a 00:05:38.128 00:05:38.128 Elapsed time = 0.008 seconds 00:05:38.128 00:05:38.128 real 0m0.068s 00:05:38.128 user 0m0.014s 00:05:38.128 sys 0m0.053s 00:05:38.128 07:00:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:38.128 07:00:55 -- common/autotest_common.sh@10 -- # set +x 00:05:38.128 ************************************ 00:05:38.128 END TEST env_mem_callbacks 00:05:38.128 ************************************ 00:05:38.128 00:05:38.128 real 0m6.562s 00:05:38.128 user 0m4.537s 00:05:38.128 sys 0m1.292s 00:05:38.128 07:00:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:38.128 07:00:56 -- common/autotest_common.sh@10 -- # set +x 00:05:38.128 ************************************ 00:05:38.128 END TEST env 00:05:38.128 ************************************ 00:05:38.128 07:00:56 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:38.128 07:00:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:38.128 07:00:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:38.128 07:00:56 -- common/autotest_common.sh@10 -- # set +x 00:05:38.128 ************************************ 00:05:38.128 START TEST rpc 00:05:38.128 ************************************ 00:05:38.128 07:00:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:38.128 * Looking for test storage... 
00:05:38.128 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:38.128 07:00:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:38.128 07:00:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:38.128 07:00:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:38.128 07:00:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:38.128 07:00:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:38.128 07:00:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:38.128 07:00:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:38.128 07:00:56 -- scripts/common.sh@335 -- # IFS=.-: 00:05:38.128 07:00:56 -- scripts/common.sh@335 -- # read -ra ver1 00:05:38.128 07:00:56 -- scripts/common.sh@336 -- # IFS=.-: 00:05:38.128 07:00:56 -- scripts/common.sh@336 -- # read -ra ver2 00:05:38.128 07:00:56 -- scripts/common.sh@337 -- # local 'op=<' 00:05:38.128 07:00:56 -- scripts/common.sh@339 -- # ver1_l=2 00:05:38.128 07:00:56 -- scripts/common.sh@340 -- # ver2_l=1 00:05:38.128 07:00:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:38.128 07:00:56 -- scripts/common.sh@343 -- # case "$op" in 00:05:38.128 07:00:56 -- scripts/common.sh@344 -- # : 1 00:05:38.128 07:00:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:38.128 07:00:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:38.128 07:00:56 -- scripts/common.sh@364 -- # decimal 1 00:05:38.128 07:00:56 -- scripts/common.sh@352 -- # local d=1 00:05:38.128 07:00:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:38.128 07:00:56 -- scripts/common.sh@354 -- # echo 1 00:05:38.128 07:00:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:38.128 07:00:56 -- scripts/common.sh@365 -- # decimal 2 00:05:38.128 07:00:56 -- scripts/common.sh@352 -- # local d=2 00:05:38.128 07:00:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:38.128 07:00:56 -- scripts/common.sh@354 -- # echo 2 00:05:38.128 07:00:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:38.128 07:00:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:38.128 07:00:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:38.128 07:00:56 -- scripts/common.sh@367 -- # return 0 00:05:38.128 07:00:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:38.128 07:00:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:38.128 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:38.128 --rc genhtml_branch_coverage=1 00:05:38.128 --rc genhtml_function_coverage=1 00:05:38.128 --rc genhtml_legend=1 00:05:38.128 --rc geninfo_all_blocks=1 00:05:38.128 --rc geninfo_unexecuted_blocks=1 00:05:38.128 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:38.128 ' 00:05:38.128 07:00:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:38.128 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:38.128 --rc genhtml_branch_coverage=1 00:05:38.128 --rc genhtml_function_coverage=1 00:05:38.128 --rc genhtml_legend=1 00:05:38.128 --rc geninfo_all_blocks=1 00:05:38.128 --rc geninfo_unexecuted_blocks=1 00:05:38.128 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:38.128 ' 00:05:38.128 07:00:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:38.128 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:38.128 --rc genhtml_branch_coverage=1 00:05:38.128 
--rc genhtml_function_coverage=1 00:05:38.128 --rc genhtml_legend=1 00:05:38.128 --rc geninfo_all_blocks=1 00:05:38.128 --rc geninfo_unexecuted_blocks=1 00:05:38.128 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:38.128 ' 00:05:38.128 07:00:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:38.128 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:38.128 --rc genhtml_branch_coverage=1 00:05:38.128 --rc genhtml_function_coverage=1 00:05:38.128 --rc genhtml_legend=1 00:05:38.128 --rc geninfo_all_blocks=1 00:05:38.128 --rc geninfo_unexecuted_blocks=1 00:05:38.128 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:38.128 ' 00:05:38.128 07:00:56 -- rpc/rpc.sh@65 -- # spdk_pid=466648 00:05:38.128 07:00:56 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:38.128 07:00:56 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:38.128 07:00:56 -- rpc/rpc.sh@67 -- # waitforlisten 466648 00:05:38.128 07:00:56 -- common/autotest_common.sh@829 -- # '[' -z 466648 ']' 00:05:38.128 07:00:56 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:38.128 07:00:56 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:38.128 07:00:56 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:38.128 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:38.128 07:00:56 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:38.128 07:00:56 -- common/autotest_common.sh@10 -- # set +x 00:05:38.128 [2024-12-13 07:00:56.270869] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:38.128 [2024-12-13 07:00:56.270961] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid466648 ] 00:05:38.128 EAL: No free 2048 kB hugepages reported on node 1 00:05:38.128 [2024-12-13 07:00:56.351852] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.387 [2024-12-13 07:00:56.387988] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:38.387 [2024-12-13 07:00:56.388090] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:38.387 [2024-12-13 07:00:56.388100] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 466648' to capture a snapshot of events at runtime. 00:05:38.387 [2024-12-13 07:00:56.388109] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid466648 for offline analysis/debug. 
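[Note] spdk_tgt was started with '-e bdev', which the notices above report as tracepoint group mask 'bdev' backed by the shared-memory file /dev/shm/spdk_tgt_trace.pid466648; the rpc_trace_cmd_test further down reads the same state back via trace_get_info (tpoint_group_mask 0x8, bdev tpoint_mask 0xffffffffffffffff). A hedged sketch of the two inspection paths the notices suggest — the first command is quoted verbatim from the log, while the -f file mode is assumed from SPDK's spdk_trace tool:

    # Snapshot events live from the running target (pid from the log):
    spdk_trace -s spdk_tgt -p 466648
    # Or keep the shm file for offline analysis after the target exits,
    # as the second notice recommends:
    cp /dev/shm/spdk_tgt_trace.pid466648 /tmp/
    spdk_trace -f /tmp/spdk_tgt_trace.pid466648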
00:05:38.387 [2024-12-13 07:00:56.388129] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.954 07:00:57 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:38.954 07:00:57 -- common/autotest_common.sh@862 -- # return 0 00:05:38.954 07:00:57 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:38.954 07:00:57 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:38.954 07:00:57 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:38.954 07:00:57 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:38.954 07:00:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:38.954 07:00:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:38.954 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:38.954 ************************************ 00:05:38.954 START TEST rpc_integrity 00:05:38.954 ************************************ 00:05:38.954 07:00:57 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:38.954 07:00:57 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:38.954 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.954 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:38.954 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.954 07:00:57 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:38.954 07:00:57 -- rpc/rpc.sh@13 -- # jq length 00:05:38.954 07:00:57 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:38.954 07:00:57 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:38.954 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.954 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:38.954 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.954 07:00:57 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:38.954 07:00:57 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:38.954 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.954 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.213 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.213 07:00:57 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:39.213 { 00:05:39.213 "name": "Malloc0", 00:05:39.213 "aliases": [ 00:05:39.213 "a5eab198-0a0e-4d90-94a4-e72faba0e619" 00:05:39.213 ], 00:05:39.213 "product_name": "Malloc disk", 00:05:39.213 "block_size": 512, 00:05:39.213 "num_blocks": 16384, 00:05:39.213 "uuid": "a5eab198-0a0e-4d90-94a4-e72faba0e619", 00:05:39.213 "assigned_rate_limits": { 00:05:39.213 "rw_ios_per_sec": 0, 00:05:39.213 "rw_mbytes_per_sec": 0, 00:05:39.213 "r_mbytes_per_sec": 0, 00:05:39.213 "w_mbytes_per_sec": 0 00:05:39.213 }, 00:05:39.213 "claimed": false, 00:05:39.213 "zoned": false, 00:05:39.213 "supported_io_types": { 00:05:39.213 "read": true, 00:05:39.213 "write": true, 00:05:39.213 "unmap": true, 00:05:39.213 "write_zeroes": true, 00:05:39.213 "flush": true, 00:05:39.213 "reset": true, 00:05:39.213 "compare": false, 00:05:39.214 "compare_and_write": false, 
00:05:39.214 "abort": true, 00:05:39.214 "nvme_admin": false, 00:05:39.214 "nvme_io": false 00:05:39.214 }, 00:05:39.214 "memory_domains": [ 00:05:39.214 { 00:05:39.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.214 "dma_device_type": 2 00:05:39.214 } 00:05:39.214 ], 00:05:39.214 "driver_specific": {} 00:05:39.214 } 00:05:39.214 ]' 00:05:39.214 07:00:57 -- rpc/rpc.sh@17 -- # jq length 00:05:39.214 07:00:57 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:39.214 07:00:57 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:39.214 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.214 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.214 [2024-12-13 07:00:57.252115] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:39.214 [2024-12-13 07:00:57.252149] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:39.214 [2024-12-13 07:00:57.252170] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4a5c850 00:05:39.214 [2024-12-13 07:00:57.252180] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:39.214 [2024-12-13 07:00:57.252994] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:39.214 [2024-12-13 07:00:57.253018] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:39.214 Passthru0 00:05:39.214 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.214 07:00:57 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:39.214 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.214 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.214 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.214 07:00:57 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:39.214 { 00:05:39.214 "name": "Malloc0", 00:05:39.214 "aliases": [ 00:05:39.214 "a5eab198-0a0e-4d90-94a4-e72faba0e619" 00:05:39.214 ], 00:05:39.214 "product_name": "Malloc disk", 00:05:39.214 "block_size": 512, 00:05:39.214 "num_blocks": 16384, 00:05:39.214 "uuid": "a5eab198-0a0e-4d90-94a4-e72faba0e619", 00:05:39.214 "assigned_rate_limits": { 00:05:39.214 "rw_ios_per_sec": 0, 00:05:39.214 "rw_mbytes_per_sec": 0, 00:05:39.214 "r_mbytes_per_sec": 0, 00:05:39.214 "w_mbytes_per_sec": 0 00:05:39.214 }, 00:05:39.214 "claimed": true, 00:05:39.214 "claim_type": "exclusive_write", 00:05:39.214 "zoned": false, 00:05:39.214 "supported_io_types": { 00:05:39.214 "read": true, 00:05:39.214 "write": true, 00:05:39.214 "unmap": true, 00:05:39.214 "write_zeroes": true, 00:05:39.214 "flush": true, 00:05:39.214 "reset": true, 00:05:39.214 "compare": false, 00:05:39.214 "compare_and_write": false, 00:05:39.214 "abort": true, 00:05:39.214 "nvme_admin": false, 00:05:39.214 "nvme_io": false 00:05:39.214 }, 00:05:39.214 "memory_domains": [ 00:05:39.214 { 00:05:39.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.214 "dma_device_type": 2 00:05:39.214 } 00:05:39.214 ], 00:05:39.214 "driver_specific": {} 00:05:39.214 }, 00:05:39.214 { 00:05:39.214 "name": "Passthru0", 00:05:39.214 "aliases": [ 00:05:39.214 "f0aa7370-d486-5308-bab7-14ef79bdc218" 00:05:39.214 ], 00:05:39.214 "product_name": "passthru", 00:05:39.214 "block_size": 512, 00:05:39.214 "num_blocks": 16384, 00:05:39.214 "uuid": "f0aa7370-d486-5308-bab7-14ef79bdc218", 00:05:39.214 "assigned_rate_limits": { 00:05:39.214 "rw_ios_per_sec": 0, 00:05:39.214 "rw_mbytes_per_sec": 0, 00:05:39.214 "r_mbytes_per_sec": 0, 00:05:39.214 
"w_mbytes_per_sec": 0 00:05:39.214 }, 00:05:39.214 "claimed": false, 00:05:39.214 "zoned": false, 00:05:39.214 "supported_io_types": { 00:05:39.214 "read": true, 00:05:39.214 "write": true, 00:05:39.214 "unmap": true, 00:05:39.214 "write_zeroes": true, 00:05:39.214 "flush": true, 00:05:39.214 "reset": true, 00:05:39.214 "compare": false, 00:05:39.214 "compare_and_write": false, 00:05:39.214 "abort": true, 00:05:39.214 "nvme_admin": false, 00:05:39.214 "nvme_io": false 00:05:39.214 }, 00:05:39.214 "memory_domains": [ 00:05:39.214 { 00:05:39.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.214 "dma_device_type": 2 00:05:39.214 } 00:05:39.214 ], 00:05:39.214 "driver_specific": { 00:05:39.214 "passthru": { 00:05:39.214 "name": "Passthru0", 00:05:39.214 "base_bdev_name": "Malloc0" 00:05:39.214 } 00:05:39.214 } 00:05:39.214 } 00:05:39.214 ]' 00:05:39.214 07:00:57 -- rpc/rpc.sh@21 -- # jq length 00:05:39.214 07:00:57 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:39.214 07:00:57 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:39.214 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.214 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.214 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.214 07:00:57 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:39.214 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.214 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.214 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.214 07:00:57 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:39.214 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.214 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.214 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.214 07:00:57 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:39.214 07:00:57 -- rpc/rpc.sh@26 -- # jq length 00:05:39.214 07:00:57 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:39.214 00:05:39.214 real 0m0.284s 00:05:39.214 user 0m0.177s 00:05:39.214 sys 0m0.048s 00:05:39.214 07:00:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.214 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.214 ************************************ 00:05:39.214 END TEST rpc_integrity 00:05:39.214 ************************************ 00:05:39.214 07:00:57 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:39.214 07:00:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.214 07:00:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.214 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.214 ************************************ 00:05:39.214 START TEST rpc_plugins 00:05:39.214 ************************************ 00:05:39.214 07:00:57 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:05:39.473 07:00:57 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:39.473 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.473 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.473 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.473 07:00:57 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:39.473 07:00:57 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:39.473 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.473 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.473 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.473 07:00:57 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:05:39.473 { 00:05:39.473 "name": "Malloc1", 00:05:39.473 "aliases": [ 00:05:39.473 "44a2c2bf-cb33-4a4f-9544-64530fc8cded" 00:05:39.473 ], 00:05:39.473 "product_name": "Malloc disk", 00:05:39.473 "block_size": 4096, 00:05:39.473 "num_blocks": 256, 00:05:39.473 "uuid": "44a2c2bf-cb33-4a4f-9544-64530fc8cded", 00:05:39.473 "assigned_rate_limits": { 00:05:39.473 "rw_ios_per_sec": 0, 00:05:39.473 "rw_mbytes_per_sec": 0, 00:05:39.473 "r_mbytes_per_sec": 0, 00:05:39.473 "w_mbytes_per_sec": 0 00:05:39.473 }, 00:05:39.473 "claimed": false, 00:05:39.473 "zoned": false, 00:05:39.473 "supported_io_types": { 00:05:39.473 "read": true, 00:05:39.473 "write": true, 00:05:39.473 "unmap": true, 00:05:39.473 "write_zeroes": true, 00:05:39.473 "flush": true, 00:05:39.473 "reset": true, 00:05:39.473 "compare": false, 00:05:39.473 "compare_and_write": false, 00:05:39.473 "abort": true, 00:05:39.473 "nvme_admin": false, 00:05:39.473 "nvme_io": false 00:05:39.473 }, 00:05:39.473 "memory_domains": [ 00:05:39.473 { 00:05:39.473 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.473 "dma_device_type": 2 00:05:39.473 } 00:05:39.473 ], 00:05:39.473 "driver_specific": {} 00:05:39.473 } 00:05:39.473 ]' 00:05:39.473 07:00:57 -- rpc/rpc.sh@32 -- # jq length 00:05:39.473 07:00:57 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:39.473 07:00:57 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:39.473 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.473 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.473 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.473 07:00:57 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:39.473 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.473 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.473 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.473 07:00:57 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:39.473 07:00:57 -- rpc/rpc.sh@36 -- # jq length 00:05:39.473 07:00:57 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:39.473 00:05:39.473 real 0m0.142s 00:05:39.473 user 0m0.086s 00:05:39.473 sys 0m0.024s 00:05:39.473 07:00:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.473 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.473 ************************************ 00:05:39.473 END TEST rpc_plugins 00:05:39.473 ************************************ 00:05:39.473 07:00:57 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:39.473 07:00:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.473 07:00:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.473 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.473 ************************************ 00:05:39.473 START TEST rpc_trace_cmd_test 00:05:39.473 ************************************ 00:05:39.473 07:00:57 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:05:39.473 07:00:57 -- rpc/rpc.sh@40 -- # local info 00:05:39.473 07:00:57 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:39.473 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.473 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.473 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.473 07:00:57 -- rpc/rpc.sh@42 -- # info='{ 00:05:39.473 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid466648", 00:05:39.473 "tpoint_group_mask": "0x8", 00:05:39.473 "iscsi_conn": { 00:05:39.473 "mask": "0x2", 
00:05:39.473 "tpoint_mask": "0x0" 00:05:39.473 }, 00:05:39.473 "scsi": { 00:05:39.473 "mask": "0x4", 00:05:39.473 "tpoint_mask": "0x0" 00:05:39.473 }, 00:05:39.473 "bdev": { 00:05:39.473 "mask": "0x8", 00:05:39.474 "tpoint_mask": "0xffffffffffffffff" 00:05:39.474 }, 00:05:39.474 "nvmf_rdma": { 00:05:39.474 "mask": "0x10", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 }, 00:05:39.474 "nvmf_tcp": { 00:05:39.474 "mask": "0x20", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 }, 00:05:39.474 "ftl": { 00:05:39.474 "mask": "0x40", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 }, 00:05:39.474 "blobfs": { 00:05:39.474 "mask": "0x80", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 }, 00:05:39.474 "dsa": { 00:05:39.474 "mask": "0x200", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 }, 00:05:39.474 "thread": { 00:05:39.474 "mask": "0x400", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 }, 00:05:39.474 "nvme_pcie": { 00:05:39.474 "mask": "0x800", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 }, 00:05:39.474 "iaa": { 00:05:39.474 "mask": "0x1000", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 }, 00:05:39.474 "nvme_tcp": { 00:05:39.474 "mask": "0x2000", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 }, 00:05:39.474 "bdev_nvme": { 00:05:39.474 "mask": "0x4000", 00:05:39.474 "tpoint_mask": "0x0" 00:05:39.474 } 00:05:39.474 }' 00:05:39.474 07:00:57 -- rpc/rpc.sh@43 -- # jq length 00:05:39.474 07:00:57 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:39.474 07:00:57 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:39.732 07:00:57 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:39.732 07:00:57 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:39.732 07:00:57 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:39.732 07:00:57 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:39.732 07:00:57 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:39.732 07:00:57 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:39.732 07:00:57 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:39.732 00:05:39.732 real 0m0.229s 00:05:39.732 user 0m0.176s 00:05:39.732 sys 0m0.044s 00:05:39.732 07:00:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.732 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.732 ************************************ 00:05:39.732 END TEST rpc_trace_cmd_test 00:05:39.732 ************************************ 00:05:39.732 07:00:57 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:39.732 07:00:57 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:39.732 07:00:57 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:39.732 07:00:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.732 07:00:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.732 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.732 ************************************ 00:05:39.732 START TEST rpc_daemon_integrity 00:05:39.732 ************************************ 00:05:39.732 07:00:57 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:39.732 07:00:57 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:39.732 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.732 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.732 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.732 07:00:57 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:39.732 07:00:57 -- rpc/rpc.sh@13 -- # jq length 00:05:39.732 07:00:57 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:39.732 07:00:57 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:39.732 
07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.732 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.991 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.991 07:00:57 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:39.991 07:00:57 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:39.991 07:00:57 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.991 07:00:57 -- common/autotest_common.sh@10 -- # set +x 00:05:39.991 07:00:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.991 07:00:57 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:39.991 { 00:05:39.991 "name": "Malloc2", 00:05:39.991 "aliases": [ 00:05:39.991 "0c878442-bfe8-4b52-87c4-eb6f79f0801e" 00:05:39.991 ], 00:05:39.991 "product_name": "Malloc disk", 00:05:39.991 "block_size": 512, 00:05:39.991 "num_blocks": 16384, 00:05:39.991 "uuid": "0c878442-bfe8-4b52-87c4-eb6f79f0801e", 00:05:39.991 "assigned_rate_limits": { 00:05:39.991 "rw_ios_per_sec": 0, 00:05:39.991 "rw_mbytes_per_sec": 0, 00:05:39.991 "r_mbytes_per_sec": 0, 00:05:39.991 "w_mbytes_per_sec": 0 00:05:39.991 }, 00:05:39.991 "claimed": false, 00:05:39.991 "zoned": false, 00:05:39.991 "supported_io_types": { 00:05:39.991 "read": true, 00:05:39.991 "write": true, 00:05:39.991 "unmap": true, 00:05:39.991 "write_zeroes": true, 00:05:39.991 "flush": true, 00:05:39.991 "reset": true, 00:05:39.991 "compare": false, 00:05:39.991 "compare_and_write": false, 00:05:39.991 "abort": true, 00:05:39.991 "nvme_admin": false, 00:05:39.991 "nvme_io": false 00:05:39.991 }, 00:05:39.991 "memory_domains": [ 00:05:39.991 { 00:05:39.991 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.991 "dma_device_type": 2 00:05:39.991 } 00:05:39.991 ], 00:05:39.991 "driver_specific": {} 00:05:39.991 } 00:05:39.991 ]' 00:05:39.991 07:00:58 -- rpc/rpc.sh@17 -- # jq length 00:05:39.991 07:00:58 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:39.991 07:00:58 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:39.991 07:00:58 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.991 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:39.991 [2024-12-13 07:00:58.050225] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:39.991 [2024-12-13 07:00:58.050255] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:39.991 [2024-12-13 07:00:58.050270] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4a5e4c0 00:05:39.991 [2024-12-13 07:00:58.050279] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:39.991 [2024-12-13 07:00:58.050961] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:39.991 [2024-12-13 07:00:58.050982] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:39.991 Passthru0 00:05:39.992 07:00:58 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.992 07:00:58 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:39.992 07:00:58 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.992 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:39.992 07:00:58 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.992 07:00:58 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:39.992 { 00:05:39.992 "name": "Malloc2", 00:05:39.992 "aliases": [ 00:05:39.992 "0c878442-bfe8-4b52-87c4-eb6f79f0801e" 00:05:39.992 ], 00:05:39.992 "product_name": "Malloc disk", 00:05:39.992 "block_size": 512, 00:05:39.992 "num_blocks": 16384, 
00:05:39.992 "uuid": "0c878442-bfe8-4b52-87c4-eb6f79f0801e", 00:05:39.992 "assigned_rate_limits": { 00:05:39.992 "rw_ios_per_sec": 0, 00:05:39.992 "rw_mbytes_per_sec": 0, 00:05:39.992 "r_mbytes_per_sec": 0, 00:05:39.992 "w_mbytes_per_sec": 0 00:05:39.992 }, 00:05:39.992 "claimed": true, 00:05:39.992 "claim_type": "exclusive_write", 00:05:39.992 "zoned": false, 00:05:39.992 "supported_io_types": { 00:05:39.992 "read": true, 00:05:39.992 "write": true, 00:05:39.992 "unmap": true, 00:05:39.992 "write_zeroes": true, 00:05:39.992 "flush": true, 00:05:39.992 "reset": true, 00:05:39.992 "compare": false, 00:05:39.992 "compare_and_write": false, 00:05:39.992 "abort": true, 00:05:39.992 "nvme_admin": false, 00:05:39.992 "nvme_io": false 00:05:39.992 }, 00:05:39.992 "memory_domains": [ 00:05:39.992 { 00:05:39.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.992 "dma_device_type": 2 00:05:39.992 } 00:05:39.992 ], 00:05:39.992 "driver_specific": {} 00:05:39.992 }, 00:05:39.992 { 00:05:39.992 "name": "Passthru0", 00:05:39.992 "aliases": [ 00:05:39.992 "e6fdb2e9-a61b-53ec-a4b8-3b6f9a4a53da" 00:05:39.992 ], 00:05:39.992 "product_name": "passthru", 00:05:39.992 "block_size": 512, 00:05:39.992 "num_blocks": 16384, 00:05:39.992 "uuid": "e6fdb2e9-a61b-53ec-a4b8-3b6f9a4a53da", 00:05:39.992 "assigned_rate_limits": { 00:05:39.992 "rw_ios_per_sec": 0, 00:05:39.992 "rw_mbytes_per_sec": 0, 00:05:39.992 "r_mbytes_per_sec": 0, 00:05:39.992 "w_mbytes_per_sec": 0 00:05:39.992 }, 00:05:39.992 "claimed": false, 00:05:39.992 "zoned": false, 00:05:39.992 "supported_io_types": { 00:05:39.992 "read": true, 00:05:39.992 "write": true, 00:05:39.992 "unmap": true, 00:05:39.992 "write_zeroes": true, 00:05:39.992 "flush": true, 00:05:39.992 "reset": true, 00:05:39.992 "compare": false, 00:05:39.992 "compare_and_write": false, 00:05:39.992 "abort": true, 00:05:39.992 "nvme_admin": false, 00:05:39.992 "nvme_io": false 00:05:39.992 }, 00:05:39.992 "memory_domains": [ 00:05:39.992 { 00:05:39.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.992 "dma_device_type": 2 00:05:39.992 } 00:05:39.992 ], 00:05:39.992 "driver_specific": { 00:05:39.992 "passthru": { 00:05:39.992 "name": "Passthru0", 00:05:39.992 "base_bdev_name": "Malloc2" 00:05:39.992 } 00:05:39.992 } 00:05:39.992 } 00:05:39.992 ]' 00:05:39.992 07:00:58 -- rpc/rpc.sh@21 -- # jq length 00:05:39.992 07:00:58 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:39.992 07:00:58 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:39.992 07:00:58 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.992 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:39.992 07:00:58 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.992 07:00:58 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:39.992 07:00:58 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.992 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:39.992 07:00:58 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.992 07:00:58 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:39.992 07:00:58 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.992 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:39.992 07:00:58 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.992 07:00:58 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:39.992 07:00:58 -- rpc/rpc.sh@26 -- # jq length 00:05:39.992 07:00:58 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:39.992 00:05:39.992 real 0m0.273s 00:05:39.992 user 0m0.161s 00:05:39.992 sys 0m0.056s 00:05:39.992 
07:00:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.992 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:39.992 ************************************ 00:05:39.992 END TEST rpc_daemon_integrity 00:05:39.992 ************************************ 00:05:40.251 07:00:58 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:40.251 07:00:58 -- rpc/rpc.sh@84 -- # killprocess 466648 00:05:40.251 07:00:58 -- common/autotest_common.sh@936 -- # '[' -z 466648 ']' 00:05:40.251 07:00:58 -- common/autotest_common.sh@940 -- # kill -0 466648 00:05:40.251 07:00:58 -- common/autotest_common.sh@941 -- # uname 00:05:40.251 07:00:58 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:40.251 07:00:58 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 466648 00:05:40.251 07:00:58 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:40.251 07:00:58 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:40.251 07:00:58 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 466648' 00:05:40.251 killing process with pid 466648 00:05:40.251 07:00:58 -- common/autotest_common.sh@955 -- # kill 466648 00:05:40.251 07:00:58 -- common/autotest_common.sh@960 -- # wait 466648 00:05:40.510 00:05:40.510 real 0m2.524s 00:05:40.510 user 0m3.129s 00:05:40.510 sys 0m0.807s 00:05:40.510 07:00:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:40.510 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:40.510 ************************************ 00:05:40.510 END TEST rpc 00:05:40.510 ************************************ 00:05:40.510 07:00:58 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:40.510 07:00:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:40.510 07:00:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:40.510 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:40.510 ************************************ 00:05:40.510 START TEST rpc_client 00:05:40.510 ************************************ 00:05:40.510 07:00:58 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:40.510 * Looking for test storage... 
00:05:40.510 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:40.510 07:00:58 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:40.510 07:00:58 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:40.510 07:00:58 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:40.778 07:00:58 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:40.778 07:00:58 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:40.778 07:00:58 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:40.778 07:00:58 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:40.778 07:00:58 -- scripts/common.sh@335 -- # IFS=.-: 00:05:40.778 07:00:58 -- scripts/common.sh@335 -- # read -ra ver1 00:05:40.778 07:00:58 -- scripts/common.sh@336 -- # IFS=.-: 00:05:40.778 07:00:58 -- scripts/common.sh@336 -- # read -ra ver2 00:05:40.778 07:00:58 -- scripts/common.sh@337 -- # local 'op=<' 00:05:40.778 07:00:58 -- scripts/common.sh@339 -- # ver1_l=2 00:05:40.778 07:00:58 -- scripts/common.sh@340 -- # ver2_l=1 00:05:40.778 07:00:58 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:40.778 07:00:58 -- scripts/common.sh@343 -- # case "$op" in 00:05:40.778 07:00:58 -- scripts/common.sh@344 -- # : 1 00:05:40.778 07:00:58 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:40.778 07:00:58 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:40.778 07:00:58 -- scripts/common.sh@364 -- # decimal 1 00:05:40.778 07:00:58 -- scripts/common.sh@352 -- # local d=1 00:05:40.778 07:00:58 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:40.778 07:00:58 -- scripts/common.sh@354 -- # echo 1 00:05:40.778 07:00:58 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:40.778 07:00:58 -- scripts/common.sh@365 -- # decimal 2 00:05:40.778 07:00:58 -- scripts/common.sh@352 -- # local d=2 00:05:40.778 07:00:58 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:40.778 07:00:58 -- scripts/common.sh@354 -- # echo 2 00:05:40.778 07:00:58 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:40.778 07:00:58 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:40.778 07:00:58 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:40.778 07:00:58 -- scripts/common.sh@367 -- # return 0 00:05:40.778 07:00:58 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:40.778 07:00:58 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:40.778 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.778 --rc genhtml_branch_coverage=1 00:05:40.778 --rc genhtml_function_coverage=1 00:05:40.778 --rc genhtml_legend=1 00:05:40.778 --rc geninfo_all_blocks=1 00:05:40.778 --rc geninfo_unexecuted_blocks=1 00:05:40.778 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.778 ' 00:05:40.778 07:00:58 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:40.778 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.778 --rc genhtml_branch_coverage=1 00:05:40.778 --rc genhtml_function_coverage=1 00:05:40.778 --rc genhtml_legend=1 00:05:40.778 --rc geninfo_all_blocks=1 00:05:40.778 --rc geninfo_unexecuted_blocks=1 00:05:40.778 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.778 ' 00:05:40.778 07:00:58 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:40.778 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.778 --rc genhtml_branch_coverage=1 
00:05:40.778 --rc genhtml_function_coverage=1 00:05:40.778 --rc genhtml_legend=1 00:05:40.778 --rc geninfo_all_blocks=1 00:05:40.778 --rc geninfo_unexecuted_blocks=1 00:05:40.778 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.778 ' 00:05:40.778 07:00:58 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:40.778 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.778 --rc genhtml_branch_coverage=1 00:05:40.778 --rc genhtml_function_coverage=1 00:05:40.778 --rc genhtml_legend=1 00:05:40.778 --rc geninfo_all_blocks=1 00:05:40.778 --rc geninfo_unexecuted_blocks=1 00:05:40.778 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.778 ' 00:05:40.778 07:00:58 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:40.778 OK 00:05:40.778 07:00:58 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:40.778 00:05:40.778 real 0m0.211s 00:05:40.778 user 0m0.118s 00:05:40.778 sys 0m0.111s 00:05:40.778 07:00:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:40.778 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:40.778 ************************************ 00:05:40.778 END TEST rpc_client 00:05:40.778 ************************************ 00:05:40.778 07:00:58 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:40.779 07:00:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:40.779 07:00:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:40.779 07:00:58 -- common/autotest_common.sh@10 -- # set +x 00:05:40.779 ************************************ 00:05:40.779 START TEST json_config 00:05:40.779 ************************************ 00:05:40.779 07:00:58 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:40.779 07:00:58 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:40.779 07:00:58 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:40.779 07:00:58 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:41.047 07:00:59 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:41.047 07:00:59 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:41.047 07:00:59 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:41.047 07:00:59 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:41.047 07:00:59 -- scripts/common.sh@335 -- # IFS=.-: 00:05:41.047 07:00:59 -- scripts/common.sh@335 -- # read -ra ver1 00:05:41.047 07:00:59 -- scripts/common.sh@336 -- # IFS=.-: 00:05:41.047 07:00:59 -- scripts/common.sh@336 -- # read -ra ver2 00:05:41.047 07:00:59 -- scripts/common.sh@337 -- # local 'op=<' 00:05:41.047 07:00:59 -- scripts/common.sh@339 -- # ver1_l=2 00:05:41.047 07:00:59 -- scripts/common.sh@340 -- # ver2_l=1 00:05:41.047 07:00:59 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:41.047 07:00:59 -- scripts/common.sh@343 -- # case "$op" in 00:05:41.047 07:00:59 -- scripts/common.sh@344 -- # : 1 00:05:41.047 07:00:59 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:41.047 07:00:59 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:41.047 07:00:59 -- scripts/common.sh@364 -- # decimal 1 00:05:41.047 07:00:59 -- scripts/common.sh@352 -- # local d=1 00:05:41.047 07:00:59 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:41.047 07:00:59 -- scripts/common.sh@354 -- # echo 1 00:05:41.047 07:00:59 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:41.047 07:00:59 -- scripts/common.sh@365 -- # decimal 2 00:05:41.047 07:00:59 -- scripts/common.sh@352 -- # local d=2 00:05:41.047 07:00:59 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:41.047 07:00:59 -- scripts/common.sh@354 -- # echo 2 00:05:41.047 07:00:59 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:41.047 07:00:59 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:41.047 07:00:59 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:41.047 07:00:59 -- scripts/common.sh@367 -- # return 0 00:05:41.047 07:00:59 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:41.047 07:00:59 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:41.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:41.047 --rc genhtml_branch_coverage=1 00:05:41.047 --rc genhtml_function_coverage=1 00:05:41.047 --rc genhtml_legend=1 00:05:41.047 --rc geninfo_all_blocks=1 00:05:41.047 --rc geninfo_unexecuted_blocks=1 00:05:41.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:41.047 ' 00:05:41.047 07:00:59 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:41.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:41.047 --rc genhtml_branch_coverage=1 00:05:41.047 --rc genhtml_function_coverage=1 00:05:41.047 --rc genhtml_legend=1 00:05:41.047 --rc geninfo_all_blocks=1 00:05:41.047 --rc geninfo_unexecuted_blocks=1 00:05:41.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:41.047 ' 00:05:41.047 07:00:59 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:41.047 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:41.047 --rc genhtml_branch_coverage=1 00:05:41.047 --rc genhtml_function_coverage=1 00:05:41.047 --rc genhtml_legend=1 00:05:41.047 --rc geninfo_all_blocks=1 00:05:41.047 --rc geninfo_unexecuted_blocks=1 00:05:41.047 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:41.047 ' 00:05:41.048 07:00:59 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:41.048 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:41.048 --rc genhtml_branch_coverage=1 00:05:41.048 --rc genhtml_function_coverage=1 00:05:41.048 --rc genhtml_legend=1 00:05:41.048 --rc geninfo_all_blocks=1 00:05:41.048 --rc geninfo_unexecuted_blocks=1 00:05:41.048 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:41.048 ' 00:05:41.048 07:00:59 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:41.048 07:00:59 -- nvmf/common.sh@7 -- # uname -s 00:05:41.048 07:00:59 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:41.048 07:00:59 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:41.048 07:00:59 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:41.048 07:00:59 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:41.048 07:00:59 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:41.048 07:00:59 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:41.048 07:00:59 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:41.048 07:00:59 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:41.048 07:00:59 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:41.048 07:00:59 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:41.048 07:00:59 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:41.048 07:00:59 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:41.048 07:00:59 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:41.048 07:00:59 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:41.048 07:00:59 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:41.048 07:00:59 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:41.048 07:00:59 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:41.048 07:00:59 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:41.048 07:00:59 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:41.048 07:00:59 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:41.048 07:00:59 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:41.048 07:00:59 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:41.048 07:00:59 -- paths/export.sh@5 -- # export PATH 00:05:41.048 07:00:59 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:41.048 07:00:59 -- nvmf/common.sh@46 -- # : 0 00:05:41.048 07:00:59 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:41.048 07:00:59 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:41.048 07:00:59 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:41.048 07:00:59 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:41.048 07:00:59 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:41.048 07:00:59 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:41.048 07:00:59 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:41.048 
07:00:59 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:41.048 07:00:59 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:41.048 07:00:59 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:41.048 07:00:59 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:41.048 07:00:59 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:41.048 07:00:59 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:41.048 WARNING: No tests are enabled so not running JSON configuration tests 00:05:41.048 07:00:59 -- json_config/json_config.sh@27 -- # exit 0 00:05:41.048 00:05:41.048 real 0m0.191s 00:05:41.048 user 0m0.118s 00:05:41.048 sys 0m0.081s 00:05:41.048 07:00:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:41.048 07:00:59 -- common/autotest_common.sh@10 -- # set +x 00:05:41.048 ************************************ 00:05:41.048 END TEST json_config 00:05:41.048 ************************************ 00:05:41.048 07:00:59 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:41.048 07:00:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:41.048 07:00:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:41.048 07:00:59 -- common/autotest_common.sh@10 -- # set +x 00:05:41.048 ************************************ 00:05:41.048 START TEST json_config_extra_key 00:05:41.048 ************************************ 00:05:41.048 07:00:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:41.048 07:00:59 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:41.048 07:00:59 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:41.048 07:00:59 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:41.048 07:00:59 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:41.048 07:00:59 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:41.048 07:00:59 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:41.048 07:00:59 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:41.048 07:00:59 -- scripts/common.sh@335 -- # IFS=.-: 00:05:41.048 07:00:59 -- scripts/common.sh@335 -- # read -ra ver1 00:05:41.048 07:00:59 -- scripts/common.sh@336 -- # IFS=.-: 00:05:41.048 07:00:59 -- scripts/common.sh@336 -- # read -ra ver2 00:05:41.048 07:00:59 -- scripts/common.sh@337 -- # local 'op=<' 00:05:41.048 07:00:59 -- scripts/common.sh@339 -- # ver1_l=2 00:05:41.048 07:00:59 -- scripts/common.sh@340 -- # ver2_l=1 00:05:41.048 07:00:59 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:41.048 07:00:59 -- scripts/common.sh@343 -- # case "$op" in 00:05:41.048 07:00:59 -- scripts/common.sh@344 -- # : 1 00:05:41.048 07:00:59 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:41.048 07:00:59 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:41.048 07:00:59 -- scripts/common.sh@364 -- # decimal 1 00:05:41.313 07:00:59 -- scripts/common.sh@352 -- # local d=1 00:05:41.313 07:00:59 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:41.313 07:00:59 -- scripts/common.sh@354 -- # echo 1 00:05:41.313 07:00:59 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:41.313 07:00:59 -- scripts/common.sh@365 -- # decimal 2 00:05:41.313 07:00:59 -- scripts/common.sh@352 -- # local d=2 00:05:41.313 07:00:59 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:41.313 07:00:59 -- scripts/common.sh@354 -- # echo 2 00:05:41.313 07:00:59 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:41.313 07:00:59 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:41.313 07:00:59 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:41.313 07:00:59 -- scripts/common.sh@367 -- # return 0 00:05:41.313 07:00:59 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:41.313 07:00:59 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:41.313 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:41.313 --rc genhtml_branch_coverage=1 00:05:41.313 --rc genhtml_function_coverage=1 00:05:41.313 --rc genhtml_legend=1 00:05:41.313 --rc geninfo_all_blocks=1 00:05:41.313 --rc geninfo_unexecuted_blocks=1 00:05:41.313 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:41.313 ' 00:05:41.313 07:00:59 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:41.313 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:41.313 --rc genhtml_branch_coverage=1 00:05:41.313 --rc genhtml_function_coverage=1 00:05:41.313 --rc genhtml_legend=1 00:05:41.313 --rc geninfo_all_blocks=1 00:05:41.313 --rc geninfo_unexecuted_blocks=1 00:05:41.313 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:41.313 ' 00:05:41.313 07:00:59 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:41.313 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:41.313 --rc genhtml_branch_coverage=1 00:05:41.313 --rc genhtml_function_coverage=1 00:05:41.313 --rc genhtml_legend=1 00:05:41.313 --rc geninfo_all_blocks=1 00:05:41.313 --rc geninfo_unexecuted_blocks=1 00:05:41.313 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:41.313 ' 00:05:41.313 07:00:59 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:41.313 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:41.313 --rc genhtml_branch_coverage=1 00:05:41.313 --rc genhtml_function_coverage=1 00:05:41.313 --rc genhtml_legend=1 00:05:41.313 --rc geninfo_all_blocks=1 00:05:41.313 --rc geninfo_unexecuted_blocks=1 00:05:41.313 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:41.313 ' 00:05:41.313 07:00:59 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:41.313 07:00:59 -- nvmf/common.sh@7 -- # uname -s 00:05:41.313 07:00:59 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:41.313 07:00:59 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:41.313 07:00:59 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:41.313 07:00:59 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:41.313 07:00:59 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:41.313 07:00:59 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:41.313 07:00:59 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:41.313 07:00:59 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:41.313 07:00:59 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:41.313 07:00:59 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:41.313 07:00:59 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:41.313 07:00:59 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:41.313 07:00:59 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:41.313 07:00:59 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:41.313 07:00:59 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:41.313 07:00:59 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:41.313 07:00:59 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:41.313 07:00:59 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:41.314 07:00:59 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:41.314 07:00:59 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:41.314 07:00:59 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:41.314 07:00:59 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:41.314 07:00:59 -- paths/export.sh@5 -- # export PATH 00:05:41.314 07:00:59 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:41.314 07:00:59 -- nvmf/common.sh@46 -- # : 0 00:05:41.314 07:00:59 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:41.314 07:00:59 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:41.314 07:00:59 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:41.314 07:00:59 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:41.314 07:00:59 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:41.314 07:00:59 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:41.314 07:00:59 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:41.314 
07:00:59 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:41.314 INFO: launching applications... 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=467448 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:41.314 Waiting for target to run... 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 467448 /var/tmp/spdk_tgt.sock 00:05:41.314 07:00:59 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:41.314 07:00:59 -- common/autotest_common.sh@829 -- # '[' -z 467448 ']' 00:05:41.314 07:00:59 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:41.314 07:00:59 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:41.314 07:00:59 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:41.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:41.314 07:00:59 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:41.314 07:00:59 -- common/autotest_common.sh@10 -- # set +x 00:05:41.314 [2024-12-13 07:00:59.357682] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
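[Editor's note: the trace above sets up json_config_extra_key.sh's per-app bookkeeping (app_pid, app_socket, app_params, configs_path, all keyed by "target") and then launches spdk_tgt against a private RPC socket. A minimal sketch of that start-app idiom, with $SPDK_ROOT standing in for the long workspace path and waitforlisten assumed from autotest_common.sh (a sketch of it appears further down):

    declare -A app_pid app_socket app_params configs_path
    app_socket[target]=/var/tmp/spdk_tgt.sock
    app_params[target]='-m 0x1 -s 1024'
    configs_path[target]=$SPDK_ROOT/test/json_config/extra_key.json

    app=target
    # launch the target on its own RPC socket with the extra-key JSON config;
    # app_params is deliberately unquoted so its flags word-split
    $SPDK_ROOT/build/bin/spdk_tgt ${app_params[$app]} \
        -r "${app_socket[$app]}" --json "${configs_path[$app]}" &
    app_pid[$app]=$!
    # block until the UNIX-domain socket accepts RPCs
    waitforlisten "${app_pid[$app]}" "${app_socket[$app]}"
]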
00:05:41.314 [2024-12-13 07:00:59.357775] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid467448 ] 00:05:41.314 EAL: No free 2048 kB hugepages reported on node 1 00:05:41.582 [2024-12-13 07:00:59.805304] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:41.867 [2024-12-13 07:00:59.833590] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:41.867 [2024-12-13 07:00:59.833698] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.134 07:01:00 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:42.134 07:01:00 -- common/autotest_common.sh@862 -- # return 0 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:42.134 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:42.134 INFO: shutting down applications... 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 467448 ]] 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 467448 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@50 -- # kill -0 467448 00:05:42.134 07:01:00 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:42.735 07:01:00 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:42.735 07:01:00 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:42.735 07:01:00 -- json_config/json_config_extra_key.sh@50 -- # kill -0 467448 00:05:42.735 07:01:00 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:42.735 07:01:00 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:42.735 07:01:00 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:42.735 07:01:00 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:42.735 SPDK target shutdown done 00:05:42.735 07:01:00 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:42.735 Success 00:05:42.735 00:05:42.735 real 0m1.575s 00:05:42.735 user 0m1.149s 00:05:42.735 sys 0m0.592s 00:05:42.735 07:01:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:42.735 07:01:00 -- common/autotest_common.sh@10 -- # set +x 00:05:42.735 ************************************ 00:05:42.735 END TEST json_config_extra_key 00:05:42.735 ************************************ 00:05:42.735 07:01:00 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:42.735 07:01:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.735 07:01:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.735 07:01:00 -- common/autotest_common.sh@10 -- # set +x 00:05:42.735 ************************************ 00:05:42.735 START TEST alias_rpc 00:05:42.735 ************************************ 00:05:42.735 07:01:00 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:42.735 * Looking for test storage... 00:05:42.735 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:42.735 07:01:00 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:42.735 07:01:00 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:42.735 07:01:00 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:42.735 07:01:00 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:42.735 07:01:00 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:42.735 07:01:00 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:42.735 07:01:00 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:42.735 07:01:00 -- scripts/common.sh@335 -- # IFS=.-: 00:05:42.735 07:01:00 -- scripts/common.sh@335 -- # read -ra ver1 00:05:42.735 07:01:00 -- scripts/common.sh@336 -- # IFS=.-: 00:05:42.735 07:01:00 -- scripts/common.sh@336 -- # read -ra ver2 00:05:42.735 07:01:00 -- scripts/common.sh@337 -- # local 'op=<' 00:05:42.735 07:01:00 -- scripts/common.sh@339 -- # ver1_l=2 00:05:42.735 07:01:00 -- scripts/common.sh@340 -- # ver2_l=1 00:05:42.735 07:01:00 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:42.735 07:01:00 -- scripts/common.sh@343 -- # case "$op" in 00:05:42.735 07:01:00 -- scripts/common.sh@344 -- # : 1 00:05:42.735 07:01:00 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:42.735 07:01:00 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:42.735 07:01:00 -- scripts/common.sh@364 -- # decimal 1 00:05:42.735 07:01:00 -- scripts/common.sh@352 -- # local d=1 00:05:42.735 07:01:00 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:42.735 07:01:00 -- scripts/common.sh@354 -- # echo 1 00:05:42.735 07:01:00 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:42.735 07:01:00 -- scripts/common.sh@365 -- # decimal 2 00:05:42.735 07:01:00 -- scripts/common.sh@352 -- # local d=2 00:05:42.735 07:01:00 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:42.735 07:01:00 -- scripts/common.sh@354 -- # echo 2 00:05:42.735 07:01:00 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:42.735 07:01:00 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:42.735 07:01:00 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:42.735 07:01:00 -- scripts/common.sh@367 -- # return 0 00:05:42.735 07:01:00 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:42.735 07:01:00 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:42.735 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.735 --rc genhtml_branch_coverage=1 00:05:42.735 --rc genhtml_function_coverage=1 00:05:42.735 --rc genhtml_legend=1 00:05:42.735 --rc geninfo_all_blocks=1 00:05:42.735 --rc geninfo_unexecuted_blocks=1 00:05:42.735 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.735 ' 00:05:42.735 07:01:00 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:42.735 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.735 --rc genhtml_branch_coverage=1 00:05:42.735 --rc genhtml_function_coverage=1 00:05:42.735 --rc genhtml_legend=1 00:05:42.735 --rc geninfo_all_blocks=1 00:05:42.735 --rc geninfo_unexecuted_blocks=1 00:05:42.735 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.735 ' 00:05:42.735 
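[Editor's note: the xtrace block above walks scripts/common.sh's version comparison ("lt 1.15 2" via cmp_versions): both version strings are split on ".", "-" and ":", then compared component by component. A condensed re-implementation of the logic as traced, assuming the same semantics; the real helper also validates each component through decimal():

    lt() { cmp_versions "$1" '<' "$2"; }

    cmp_versions() {
        local op=$2 v
        local -a ver1 ver2
        IFS=.-: read -ra ver1 <<< "$1"
        IFS=.-: read -ra ver2 <<< "$3"
        local ver1_l=${#ver1[@]} ver2_l=${#ver2[@]}
        # walk the longer of the two component lists, padding with zeros
        for ((v = 0; v < (ver1_l > ver2_l ? ver1_l : ver2_l); v++)); do
            local c1=${ver1[v]:-0} c2=${ver2[v]:-0}
            ((c1 > c2)) && { [[ $op == '>' ]]; return; }
            ((c1 < c2)) && { [[ $op == '<' ]]; return; }
        done
        [[ $op == '==' || $op == '<=' || $op == '>=' ]]   # all components equal
    }

So "lt 1.15 2" compares 1 against 2 at the first component and returns 0, which is exactly the "return 0" seen in the trace and what routes the lcov branch/function flags into LCOV_OPTS above.]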
07:01:00 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:42.735 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.735 --rc genhtml_branch_coverage=1 00:05:42.735 --rc genhtml_function_coverage=1 00:05:42.735 --rc genhtml_legend=1 00:05:42.735 --rc geninfo_all_blocks=1 00:05:42.736 --rc geninfo_unexecuted_blocks=1 00:05:42.736 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.736 ' 00:05:42.736 07:01:00 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:42.736 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.736 --rc genhtml_branch_coverage=1 00:05:42.736 --rc genhtml_function_coverage=1 00:05:42.736 --rc genhtml_legend=1 00:05:42.736 --rc geninfo_all_blocks=1 00:05:42.736 --rc geninfo_unexecuted_blocks=1 00:05:42.736 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.736 ' 00:05:42.736 07:01:00 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:42.736 07:01:00 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=467783 00:05:42.736 07:01:00 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.736 07:01:00 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 467783 00:05:42.736 07:01:00 -- common/autotest_common.sh@829 -- # '[' -z 467783 ']' 00:05:42.736 07:01:00 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:42.736 07:01:00 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:42.736 07:01:00 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:42.736 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:42.736 07:01:00 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:42.736 07:01:00 -- common/autotest_common.sh@10 -- # set +x 00:05:43.034 [2024-12-13 07:01:00.977848] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:05:43.034 [2024-12-13 07:01:00.977940] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid467783 ] 00:05:43.034 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.034 [2024-12-13 07:01:01.060753] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.034 [2024-12-13 07:01:01.096527] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:43.034 [2024-12-13 07:01:01.096635] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:43.646 07:01:01 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:43.646 07:01:01 -- common/autotest_common.sh@862 -- # return 0 00:05:43.646 07:01:01 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:43.905 07:01:02 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 467783 00:05:43.905 07:01:02 -- common/autotest_common.sh@936 -- # '[' -z 467783 ']' 00:05:43.905 07:01:02 -- common/autotest_common.sh@940 -- # kill -0 467783 00:05:43.905 07:01:02 -- common/autotest_common.sh@941 -- # uname 00:05:43.905 07:01:02 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:43.905 07:01:02 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 467783 00:05:43.905 07:01:02 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:43.905 07:01:02 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:43.905 07:01:02 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 467783' 00:05:43.905 killing process with pid 467783 00:05:43.905 07:01:02 -- common/autotest_common.sh@955 -- # kill 467783 00:05:43.905 07:01:02 -- common/autotest_common.sh@960 -- # wait 467783 00:05:44.164 00:05:44.164 real 0m1.606s 00:05:44.164 user 0m1.703s 00:05:44.164 sys 0m0.496s 00:05:44.164 07:01:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:44.164 07:01:02 -- common/autotest_common.sh@10 -- # set +x 00:05:44.164 ************************************ 00:05:44.164 END TEST alias_rpc 00:05:44.164 ************************************ 00:05:44.423 07:01:02 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:05:44.423 07:01:02 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:44.423 07:01:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:44.423 07:01:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:44.423 07:01:02 -- common/autotest_common.sh@10 -- # set +x 00:05:44.423 ************************************ 00:05:44.423 START TEST spdkcli_tcp 00:05:44.423 ************************************ 00:05:44.423 07:01:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:44.423 * Looking for test storage... 
00:05:44.423 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:44.423 07:01:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:44.423 07:01:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:44.423 07:01:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:44.423 07:01:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:44.423 07:01:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:44.423 07:01:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:44.423 07:01:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:44.423 07:01:02 -- scripts/common.sh@335 -- # IFS=.-: 00:05:44.423 07:01:02 -- scripts/common.sh@335 -- # read -ra ver1 00:05:44.423 07:01:02 -- scripts/common.sh@336 -- # IFS=.-: 00:05:44.423 07:01:02 -- scripts/common.sh@336 -- # read -ra ver2 00:05:44.423 07:01:02 -- scripts/common.sh@337 -- # local 'op=<' 00:05:44.423 07:01:02 -- scripts/common.sh@339 -- # ver1_l=2 00:05:44.423 07:01:02 -- scripts/common.sh@340 -- # ver2_l=1 00:05:44.423 07:01:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:44.423 07:01:02 -- scripts/common.sh@343 -- # case "$op" in 00:05:44.423 07:01:02 -- scripts/common.sh@344 -- # : 1 00:05:44.423 07:01:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:44.423 07:01:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:44.423 07:01:02 -- scripts/common.sh@364 -- # decimal 1 00:05:44.423 07:01:02 -- scripts/common.sh@352 -- # local d=1 00:05:44.423 07:01:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:44.423 07:01:02 -- scripts/common.sh@354 -- # echo 1 00:05:44.423 07:01:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:44.423 07:01:02 -- scripts/common.sh@365 -- # decimal 2 00:05:44.423 07:01:02 -- scripts/common.sh@352 -- # local d=2 00:05:44.423 07:01:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:44.423 07:01:02 -- scripts/common.sh@354 -- # echo 2 00:05:44.423 07:01:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:44.424 07:01:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:44.424 07:01:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:44.424 07:01:02 -- scripts/common.sh@367 -- # return 0 00:05:44.424 07:01:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:44.424 07:01:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:44.424 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.424 --rc genhtml_branch_coverage=1 00:05:44.424 --rc genhtml_function_coverage=1 00:05:44.424 --rc genhtml_legend=1 00:05:44.424 --rc geninfo_all_blocks=1 00:05:44.424 --rc geninfo_unexecuted_blocks=1 00:05:44.424 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:44.424 ' 00:05:44.424 07:01:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:44.424 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.424 --rc genhtml_branch_coverage=1 00:05:44.424 --rc genhtml_function_coverage=1 00:05:44.424 --rc genhtml_legend=1 00:05:44.424 --rc geninfo_all_blocks=1 00:05:44.424 --rc geninfo_unexecuted_blocks=1 00:05:44.424 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:44.424 ' 00:05:44.424 07:01:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:44.424 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.424 --rc genhtml_branch_coverage=1 
00:05:44.424 --rc genhtml_function_coverage=1 00:05:44.424 --rc genhtml_legend=1 00:05:44.424 --rc geninfo_all_blocks=1 00:05:44.424 --rc geninfo_unexecuted_blocks=1 00:05:44.424 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:44.424 ' 00:05:44.424 07:01:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:44.424 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:44.424 --rc genhtml_branch_coverage=1 00:05:44.424 --rc genhtml_function_coverage=1 00:05:44.424 --rc genhtml_legend=1 00:05:44.424 --rc geninfo_all_blocks=1 00:05:44.424 --rc geninfo_unexecuted_blocks=1 00:05:44.424 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:44.424 ' 00:05:44.424 07:01:02 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:44.424 07:01:02 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:44.424 07:01:02 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:44.424 07:01:02 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:44.424 07:01:02 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:44.424 07:01:02 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:44.424 07:01:02 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:44.424 07:01:02 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:44.424 07:01:02 -- common/autotest_common.sh@10 -- # set +x 00:05:44.424 07:01:02 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:44.424 07:01:02 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=468127 00:05:44.424 07:01:02 -- spdkcli/tcp.sh@27 -- # waitforlisten 468127 00:05:44.424 07:01:02 -- common/autotest_common.sh@829 -- # '[' -z 468127 ']' 00:05:44.424 07:01:02 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:44.424 07:01:02 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:44.424 07:01:02 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:44.424 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:44.424 07:01:02 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:44.424 07:01:02 -- common/autotest_common.sh@10 -- # set +x 00:05:44.424 [2024-12-13 07:01:02.643866] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
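[Editor's note: every section in this log blocks on "waitforlisten <pid> <socket>" before issuing RPCs. A minimal sketch of that polling idiom, assuming scripts/rpc.py and the spdk_get_version RPC; the real helper in autotest_common.sh is more thorough (configurable max_retries, cleanup on failure):

    waitforlisten() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} max_retries=100 i
        for ((i = 0; i < max_retries; i++)); do
            kill -0 "$pid" 2> /dev/null || return 1   # target died while starting
            if scripts/rpc.py -t 1 -s "$rpc_addr" spdk_get_version &> /dev/null; then
                return 0                              # socket is up and answering
            fi
            sleep 0.1
        done
        return 1                                      # never came up
    }
]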
00:05:44.424 [2024-12-13 07:01:02.643939] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468127 ] 00:05:44.682 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.682 [2024-12-13 07:01:02.721726] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:44.682 [2024-12-13 07:01:02.757887] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:44.682 [2024-12-13 07:01:02.758113] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.682 [2024-12-13 07:01:02.758114] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:45.250 07:01:03 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:45.250 07:01:03 -- common/autotest_common.sh@862 -- # return 0 00:05:45.250 07:01:03 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:45.250 07:01:03 -- spdkcli/tcp.sh@31 -- # socat_pid=468385 00:05:45.250 07:01:03 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:45.509 [ 00:05:45.509 "spdk_get_version", 00:05:45.509 "rpc_get_methods", 00:05:45.509 "trace_get_info", 00:05:45.509 "trace_get_tpoint_group_mask", 00:05:45.509 "trace_disable_tpoint_group", 00:05:45.509 "trace_enable_tpoint_group", 00:05:45.509 "trace_clear_tpoint_mask", 00:05:45.509 "trace_set_tpoint_mask", 00:05:45.509 "vfu_tgt_set_base_path", 00:05:45.509 "framework_get_pci_devices", 00:05:45.509 "framework_get_config", 00:05:45.509 "framework_get_subsystems", 00:05:45.509 "iobuf_get_stats", 00:05:45.509 "iobuf_set_options", 00:05:45.509 "sock_set_default_impl", 00:05:45.509 "sock_impl_set_options", 00:05:45.509 "sock_impl_get_options", 00:05:45.509 "vmd_rescan", 00:05:45.509 "vmd_remove_device", 00:05:45.509 "vmd_enable", 00:05:45.509 "accel_get_stats", 00:05:45.509 "accel_set_options", 00:05:45.509 "accel_set_driver", 00:05:45.509 "accel_crypto_key_destroy", 00:05:45.509 "accel_crypto_keys_get", 00:05:45.509 "accel_crypto_key_create", 00:05:45.509 "accel_assign_opc", 00:05:45.509 "accel_get_module_info", 00:05:45.509 "accel_get_opc_assignments", 00:05:45.509 "notify_get_notifications", 00:05:45.509 "notify_get_types", 00:05:45.509 "bdev_get_histogram", 00:05:45.509 "bdev_enable_histogram", 00:05:45.509 "bdev_set_qos_limit", 00:05:45.509 "bdev_set_qd_sampling_period", 00:05:45.509 "bdev_get_bdevs", 00:05:45.509 "bdev_reset_iostat", 00:05:45.509 "bdev_get_iostat", 00:05:45.509 "bdev_examine", 00:05:45.509 "bdev_wait_for_examine", 00:05:45.509 "bdev_set_options", 00:05:45.509 "scsi_get_devices", 00:05:45.509 "thread_set_cpumask", 00:05:45.509 "framework_get_scheduler", 00:05:45.509 "framework_set_scheduler", 00:05:45.509 "framework_get_reactors", 00:05:45.509 "thread_get_io_channels", 00:05:45.509 "thread_get_pollers", 00:05:45.509 "thread_get_stats", 00:05:45.509 "framework_monitor_context_switch", 00:05:45.509 "spdk_kill_instance", 00:05:45.509 "log_enable_timestamps", 00:05:45.509 "log_get_flags", 00:05:45.509 "log_clear_flag", 00:05:45.509 "log_set_flag", 00:05:45.509 "log_get_level", 00:05:45.509 "log_set_level", 00:05:45.509 "log_get_print_level", 00:05:45.509 "log_set_print_level", 00:05:45.509 "framework_enable_cpumask_locks", 00:05:45.509 "framework_disable_cpumask_locks", 00:05:45.509 "framework_wait_init", 00:05:45.509 
"framework_start_init", 00:05:45.509 "virtio_blk_create_transport", 00:05:45.509 "virtio_blk_get_transports", 00:05:45.509 "vhost_controller_set_coalescing", 00:05:45.509 "vhost_get_controllers", 00:05:45.509 "vhost_delete_controller", 00:05:45.509 "vhost_create_blk_controller", 00:05:45.509 "vhost_scsi_controller_remove_target", 00:05:45.509 "vhost_scsi_controller_add_target", 00:05:45.509 "vhost_start_scsi_controller", 00:05:45.509 "vhost_create_scsi_controller", 00:05:45.509 "ublk_recover_disk", 00:05:45.509 "ublk_get_disks", 00:05:45.509 "ublk_stop_disk", 00:05:45.509 "ublk_start_disk", 00:05:45.509 "ublk_destroy_target", 00:05:45.509 "ublk_create_target", 00:05:45.509 "nbd_get_disks", 00:05:45.509 "nbd_stop_disk", 00:05:45.509 "nbd_start_disk", 00:05:45.509 "env_dpdk_get_mem_stats", 00:05:45.509 "nvmf_subsystem_get_listeners", 00:05:45.509 "nvmf_subsystem_get_qpairs", 00:05:45.509 "nvmf_subsystem_get_controllers", 00:05:45.509 "nvmf_get_stats", 00:05:45.509 "nvmf_get_transports", 00:05:45.509 "nvmf_create_transport", 00:05:45.509 "nvmf_get_targets", 00:05:45.509 "nvmf_delete_target", 00:05:45.509 "nvmf_create_target", 00:05:45.509 "nvmf_subsystem_allow_any_host", 00:05:45.509 "nvmf_subsystem_remove_host", 00:05:45.509 "nvmf_subsystem_add_host", 00:05:45.509 "nvmf_subsystem_remove_ns", 00:05:45.509 "nvmf_subsystem_add_ns", 00:05:45.510 "nvmf_subsystem_listener_set_ana_state", 00:05:45.510 "nvmf_discovery_get_referrals", 00:05:45.510 "nvmf_discovery_remove_referral", 00:05:45.510 "nvmf_discovery_add_referral", 00:05:45.510 "nvmf_subsystem_remove_listener", 00:05:45.510 "nvmf_subsystem_add_listener", 00:05:45.510 "nvmf_delete_subsystem", 00:05:45.510 "nvmf_create_subsystem", 00:05:45.510 "nvmf_get_subsystems", 00:05:45.510 "nvmf_set_crdt", 00:05:45.510 "nvmf_set_config", 00:05:45.510 "nvmf_set_max_subsystems", 00:05:45.510 "iscsi_set_options", 00:05:45.510 "iscsi_get_auth_groups", 00:05:45.510 "iscsi_auth_group_remove_secret", 00:05:45.510 "iscsi_auth_group_add_secret", 00:05:45.510 "iscsi_delete_auth_group", 00:05:45.510 "iscsi_create_auth_group", 00:05:45.510 "iscsi_set_discovery_auth", 00:05:45.510 "iscsi_get_options", 00:05:45.510 "iscsi_target_node_request_logout", 00:05:45.510 "iscsi_target_node_set_redirect", 00:05:45.510 "iscsi_target_node_set_auth", 00:05:45.510 "iscsi_target_node_add_lun", 00:05:45.510 "iscsi_get_connections", 00:05:45.510 "iscsi_portal_group_set_auth", 00:05:45.510 "iscsi_start_portal_group", 00:05:45.510 "iscsi_delete_portal_group", 00:05:45.510 "iscsi_create_portal_group", 00:05:45.510 "iscsi_get_portal_groups", 00:05:45.510 "iscsi_delete_target_node", 00:05:45.510 "iscsi_target_node_remove_pg_ig_maps", 00:05:45.510 "iscsi_target_node_add_pg_ig_maps", 00:05:45.510 "iscsi_create_target_node", 00:05:45.510 "iscsi_get_target_nodes", 00:05:45.510 "iscsi_delete_initiator_group", 00:05:45.510 "iscsi_initiator_group_remove_initiators", 00:05:45.510 "iscsi_initiator_group_add_initiators", 00:05:45.510 "iscsi_create_initiator_group", 00:05:45.510 "iscsi_get_initiator_groups", 00:05:45.510 "vfu_virtio_create_scsi_endpoint", 00:05:45.510 "vfu_virtio_scsi_remove_target", 00:05:45.510 "vfu_virtio_scsi_add_target", 00:05:45.510 "vfu_virtio_create_blk_endpoint", 00:05:45.510 "vfu_virtio_delete_endpoint", 00:05:45.510 "iaa_scan_accel_module", 00:05:45.510 "dsa_scan_accel_module", 00:05:45.510 "ioat_scan_accel_module", 00:05:45.510 "accel_error_inject_error", 00:05:45.510 "bdev_iscsi_delete", 00:05:45.510 "bdev_iscsi_create", 00:05:45.510 "bdev_iscsi_set_options", 
00:05:45.510 "bdev_virtio_attach_controller", 00:05:45.510 "bdev_virtio_scsi_get_devices", 00:05:45.510 "bdev_virtio_detach_controller", 00:05:45.510 "bdev_virtio_blk_set_hotplug", 00:05:45.510 "bdev_ftl_set_property", 00:05:45.510 "bdev_ftl_get_properties", 00:05:45.510 "bdev_ftl_get_stats", 00:05:45.510 "bdev_ftl_unmap", 00:05:45.510 "bdev_ftl_unload", 00:05:45.510 "bdev_ftl_delete", 00:05:45.510 "bdev_ftl_load", 00:05:45.510 "bdev_ftl_create", 00:05:45.510 "bdev_aio_delete", 00:05:45.510 "bdev_aio_rescan", 00:05:45.510 "bdev_aio_create", 00:05:45.510 "blobfs_create", 00:05:45.510 "blobfs_detect", 00:05:45.510 "blobfs_set_cache_size", 00:05:45.510 "bdev_zone_block_delete", 00:05:45.510 "bdev_zone_block_create", 00:05:45.510 "bdev_delay_delete", 00:05:45.510 "bdev_delay_create", 00:05:45.510 "bdev_delay_update_latency", 00:05:45.510 "bdev_split_delete", 00:05:45.510 "bdev_split_create", 00:05:45.510 "bdev_error_inject_error", 00:05:45.510 "bdev_error_delete", 00:05:45.510 "bdev_error_create", 00:05:45.510 "bdev_raid_set_options", 00:05:45.510 "bdev_raid_remove_base_bdev", 00:05:45.510 "bdev_raid_add_base_bdev", 00:05:45.510 "bdev_raid_delete", 00:05:45.510 "bdev_raid_create", 00:05:45.510 "bdev_raid_get_bdevs", 00:05:45.510 "bdev_lvol_grow_lvstore", 00:05:45.510 "bdev_lvol_get_lvols", 00:05:45.510 "bdev_lvol_get_lvstores", 00:05:45.510 "bdev_lvol_delete", 00:05:45.510 "bdev_lvol_set_read_only", 00:05:45.510 "bdev_lvol_resize", 00:05:45.510 "bdev_lvol_decouple_parent", 00:05:45.510 "bdev_lvol_inflate", 00:05:45.510 "bdev_lvol_rename", 00:05:45.510 "bdev_lvol_clone_bdev", 00:05:45.510 "bdev_lvol_clone", 00:05:45.510 "bdev_lvol_snapshot", 00:05:45.510 "bdev_lvol_create", 00:05:45.510 "bdev_lvol_delete_lvstore", 00:05:45.510 "bdev_lvol_rename_lvstore", 00:05:45.510 "bdev_lvol_create_lvstore", 00:05:45.510 "bdev_passthru_delete", 00:05:45.510 "bdev_passthru_create", 00:05:45.510 "bdev_nvme_cuse_unregister", 00:05:45.510 "bdev_nvme_cuse_register", 00:05:45.510 "bdev_opal_new_user", 00:05:45.510 "bdev_opal_set_lock_state", 00:05:45.510 "bdev_opal_delete", 00:05:45.510 "bdev_opal_get_info", 00:05:45.510 "bdev_opal_create", 00:05:45.510 "bdev_nvme_opal_revert", 00:05:45.510 "bdev_nvme_opal_init", 00:05:45.510 "bdev_nvme_send_cmd", 00:05:45.510 "bdev_nvme_get_path_iostat", 00:05:45.510 "bdev_nvme_get_mdns_discovery_info", 00:05:45.510 "bdev_nvme_stop_mdns_discovery", 00:05:45.510 "bdev_nvme_start_mdns_discovery", 00:05:45.510 "bdev_nvme_set_multipath_policy", 00:05:45.510 "bdev_nvme_set_preferred_path", 00:05:45.510 "bdev_nvme_get_io_paths", 00:05:45.510 "bdev_nvme_remove_error_injection", 00:05:45.510 "bdev_nvme_add_error_injection", 00:05:45.510 "bdev_nvme_get_discovery_info", 00:05:45.510 "bdev_nvme_stop_discovery", 00:05:45.510 "bdev_nvme_start_discovery", 00:05:45.510 "bdev_nvme_get_controller_health_info", 00:05:45.510 "bdev_nvme_disable_controller", 00:05:45.510 "bdev_nvme_enable_controller", 00:05:45.510 "bdev_nvme_reset_controller", 00:05:45.510 "bdev_nvme_get_transport_statistics", 00:05:45.510 "bdev_nvme_apply_firmware", 00:05:45.510 "bdev_nvme_detach_controller", 00:05:45.510 "bdev_nvme_get_controllers", 00:05:45.510 "bdev_nvme_attach_controller", 00:05:45.510 "bdev_nvme_set_hotplug", 00:05:45.510 "bdev_nvme_set_options", 00:05:45.510 "bdev_null_resize", 00:05:45.510 "bdev_null_delete", 00:05:45.510 "bdev_null_create", 00:05:45.510 "bdev_malloc_delete", 00:05:45.510 "bdev_malloc_create" 00:05:45.510 ] 00:05:45.510 07:01:03 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:45.510 07:01:03 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:45.510 07:01:03 -- common/autotest_common.sh@10 -- # set +x 00:05:45.510 07:01:03 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:45.510 07:01:03 -- spdkcli/tcp.sh@38 -- # killprocess 468127 00:05:45.510 07:01:03 -- common/autotest_common.sh@936 -- # '[' -z 468127 ']' 00:05:45.510 07:01:03 -- common/autotest_common.sh@940 -- # kill -0 468127 00:05:45.510 07:01:03 -- common/autotest_common.sh@941 -- # uname 00:05:45.510 07:01:03 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:45.510 07:01:03 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 468127 00:05:45.769 07:01:03 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:45.769 07:01:03 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:45.769 07:01:03 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 468127' 00:05:45.769 killing process with pid 468127 00:05:45.769 07:01:03 -- common/autotest_common.sh@955 -- # kill 468127 00:05:45.769 07:01:03 -- common/autotest_common.sh@960 -- # wait 468127 00:05:46.028 00:05:46.028 real 0m1.648s 00:05:46.028 user 0m3.016s 00:05:46.028 sys 0m0.524s 00:05:46.028 07:01:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:46.028 07:01:04 -- common/autotest_common.sh@10 -- # set +x 00:05:46.028 ************************************ 00:05:46.028 END TEST spdkcli_tcp 00:05:46.028 ************************************ 00:05:46.028 07:01:04 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:46.028 07:01:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:46.028 07:01:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:46.028 07:01:04 -- common/autotest_common.sh@10 -- # set +x 00:05:46.028 ************************************ 00:05:46.028 START TEST dpdk_mem_utility 00:05:46.028 ************************************ 00:05:46.028 07:01:04 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:46.028 * Looking for test storage... 
00:05:46.028 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:46.028 07:01:04 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:46.028 07:01:04 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:46.028 07:01:04 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:46.287 07:01:04 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:46.287 07:01:04 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:46.287 07:01:04 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:46.287 07:01:04 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:46.287 07:01:04 -- scripts/common.sh@335 -- # IFS=.-: 00:05:46.287 07:01:04 -- scripts/common.sh@335 -- # read -ra ver1 00:05:46.287 07:01:04 -- scripts/common.sh@336 -- # IFS=.-: 00:05:46.287 07:01:04 -- scripts/common.sh@336 -- # read -ra ver2 00:05:46.287 07:01:04 -- scripts/common.sh@337 -- # local 'op=<' 00:05:46.287 07:01:04 -- scripts/common.sh@339 -- # ver1_l=2 00:05:46.287 07:01:04 -- scripts/common.sh@340 -- # ver2_l=1 00:05:46.287 07:01:04 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:46.287 07:01:04 -- scripts/common.sh@343 -- # case "$op" in 00:05:46.287 07:01:04 -- scripts/common.sh@344 -- # : 1 00:05:46.287 07:01:04 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:46.287 07:01:04 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:46.287 07:01:04 -- scripts/common.sh@364 -- # decimal 1 00:05:46.287 07:01:04 -- scripts/common.sh@352 -- # local d=1 00:05:46.287 07:01:04 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:46.287 07:01:04 -- scripts/common.sh@354 -- # echo 1 00:05:46.287 07:01:04 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:46.287 07:01:04 -- scripts/common.sh@365 -- # decimal 2 00:05:46.287 07:01:04 -- scripts/common.sh@352 -- # local d=2 00:05:46.287 07:01:04 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:46.287 07:01:04 -- scripts/common.sh@354 -- # echo 2 00:05:46.287 07:01:04 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:46.287 07:01:04 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:46.287 07:01:04 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:46.287 07:01:04 -- scripts/common.sh@367 -- # return 0 00:05:46.287 07:01:04 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:46.287 07:01:04 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:46.287 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.287 --rc genhtml_branch_coverage=1 00:05:46.287 --rc genhtml_function_coverage=1 00:05:46.287 --rc genhtml_legend=1 00:05:46.287 --rc geninfo_all_blocks=1 00:05:46.287 --rc geninfo_unexecuted_blocks=1 00:05:46.287 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.287 ' 00:05:46.287 07:01:04 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:46.287 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.287 --rc genhtml_branch_coverage=1 00:05:46.287 --rc genhtml_function_coverage=1 00:05:46.287 --rc genhtml_legend=1 00:05:46.287 --rc geninfo_all_blocks=1 00:05:46.287 --rc geninfo_unexecuted_blocks=1 00:05:46.287 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.287 ' 00:05:46.287 07:01:04 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:46.287 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.287 --rc 
genhtml_branch_coverage=1 00:05:46.287 --rc genhtml_function_coverage=1 00:05:46.287 --rc genhtml_legend=1 00:05:46.287 --rc geninfo_all_blocks=1 00:05:46.288 --rc geninfo_unexecuted_blocks=1 00:05:46.288 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.288 ' 00:05:46.288 07:01:04 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:46.288 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:46.288 --rc genhtml_branch_coverage=1 00:05:46.288 --rc genhtml_function_coverage=1 00:05:46.288 --rc genhtml_legend=1 00:05:46.288 --rc geninfo_all_blocks=1 00:05:46.288 --rc geninfo_unexecuted_blocks=1 00:05:46.288 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:46.288 ' 00:05:46.288 07:01:04 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:46.288 07:01:04 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=468466 00:05:46.288 07:01:04 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 468466 00:05:46.288 07:01:04 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:46.288 07:01:04 -- common/autotest_common.sh@829 -- # '[' -z 468466 ']' 00:05:46.288 07:01:04 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:46.288 07:01:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:46.288 07:01:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:46.288 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:46.288 07:01:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:46.288 07:01:04 -- common/autotest_common.sh@10 -- # set +x 00:05:46.288 [2024-12-13 07:01:04.334390] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
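[Editor's note: the SIGINT/SIGTERM/EXIT trap above registers killprocess for teardown; the same helper produced the "killing process with pid ..." lines in the earlier sections. A condensed sketch following the steps visible in those traces (safety check on the process name, then kill and reap):

    killprocess() {
        local pid=$1
        kill -0 "$pid" 2> /dev/null || return 0       # already gone
        if [[ $(uname) == Linux ]]; then
            local process_name
            process_name=$(ps --no-headers -o comm= "$pid")
            [[ $process_name == sudo ]] && return 1   # never kill a sudo wrapper
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"
    }
]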
00:05:46.288 [2024-12-13 07:01:04.334474] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468466 ] 00:05:46.288 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.288 [2024-12-13 07:01:04.418643] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.288 [2024-12-13 07:01:04.456114] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:46.288 [2024-12-13 07:01:04.456231] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.224 07:01:05 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:47.224 07:01:05 -- common/autotest_common.sh@862 -- # return 0 00:05:47.224 07:01:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:47.224 07:01:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:47.224 07:01:05 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:47.224 07:01:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.224 { 00:05:47.224 "filename": "/tmp/spdk_mem_dump.txt" 00:05:47.224 } 00:05:47.224 07:01:05 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:47.224 07:01:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:47.224 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:47.224 1 heaps totaling size 814.000000 MiB 00:05:47.224 size: 814.000000 MiB heap id: 0 00:05:47.224 end heaps---------- 00:05:47.224 8 mempools totaling size 598.116089 MiB 00:05:47.224 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:47.224 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:47.224 size: 84.521057 MiB name: bdev_io_468466 00:05:47.224 size: 51.011292 MiB name: evtpool_468466 00:05:47.224 size: 50.003479 MiB name: msgpool_468466 00:05:47.224 size: 21.763794 MiB name: PDU_Pool 00:05:47.224 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:47.224 size: 0.026123 MiB name: Session_Pool 00:05:47.224 end mempools------- 00:05:47.224 6 memzones totaling size 4.142822 MiB 00:05:47.224 size: 1.000366 MiB name: RG_ring_0_468466 00:05:47.224 size: 1.000366 MiB name: RG_ring_1_468466 00:05:47.224 size: 1.000366 MiB name: RG_ring_4_468466 00:05:47.224 size: 1.000366 MiB name: RG_ring_5_468466 00:05:47.224 size: 0.125366 MiB name: RG_ring_2_468466 00:05:47.224 size: 0.015991 MiB name: RG_ring_3_468466 00:05:47.224 end memzones------- 00:05:47.224 07:01:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:47.224 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:47.224 list of free elements. 
size: 12.519348 MiB 00:05:47.224 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:47.224 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:47.224 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:47.224 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:47.224 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:47.224 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:47.224 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:47.224 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:47.224 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:47.224 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:47.224 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:47.224 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:47.224 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:47.224 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:47.224 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:47.224 list of standard malloc elements. size: 199.218079 MiB 00:05:47.224 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:47.224 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:47.224 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:47.224 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:47.224 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:47.224 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:47.224 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:47.224 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:47.224 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:47.224 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:47.224 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:47.224 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:47.224 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:47.224 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:47.224 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:47.224 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:47.224 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:47.224 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:47.224 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:47.224 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:47.224 list of memzone associated elements. size: 602.262573 MiB 00:05:47.224 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:47.224 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:47.224 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:47.224 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:47.224 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:47.224 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_468466_0 00:05:47.224 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:47.224 associated memzone info: size: 48.002930 MiB name: MP_evtpool_468466_0 00:05:47.224 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:47.224 associated memzone info: size: 48.002930 MiB name: MP_msgpool_468466_0 00:05:47.224 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:47.224 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:47.224 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:47.224 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:47.224 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:47.224 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_468466 00:05:47.224 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:47.224 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_468466 00:05:47.224 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:47.224 associated memzone info: size: 1.007996 MiB name: MP_evtpool_468466 00:05:47.224 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:47.224 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:47.224 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:47.224 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:47.224 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:47.224 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:47.224 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:47.224 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:47.224 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:47.224 associated memzone info: size: 1.000366 MiB name: RG_ring_0_468466 00:05:47.224 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:47.224 associated memzone info: size: 1.000366 MiB name: RG_ring_1_468466 00:05:47.224 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:47.224 associated memzone info: size: 1.000366 MiB name: RG_ring_4_468466 00:05:47.224 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:47.224 associated memzone info: size: 1.000366 MiB name: RG_ring_5_468466 00:05:47.224 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:47.224 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_468466 00:05:47.224 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:05:47.224 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:05:47.224 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:05:47.224 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:05:47.224 element at address: 0x20001947c540 with size: 0.250488 MiB 00:05:47.224 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:05:47.224 element at address: 0x200003adf880 with size: 0.125488 MiB 00:05:47.224 associated memzone info: size: 0.125366 MiB name: RG_ring_2_468466 00:05:47.225 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:05:47.225 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:05:47.225 element at address: 0x200027e69100 with size: 0.023743 MiB 00:05:47.225 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:05:47.225 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:05:47.225 associated memzone info: size: 0.015991 MiB name: RG_ring_3_468466 00:05:47.225 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:05:47.225 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:05:47.225 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:05:47.225 associated memzone info: size: 0.000183 MiB name: MP_msgpool_468466 00:05:47.225 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:05:47.225 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_468466 00:05:47.225 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:05:47.225 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:05:47.225 07:01:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:05:47.225 07:01:05 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 468466 00:05:47.225 07:01:05 -- common/autotest_common.sh@936 -- # '[' -z 468466 ']' 00:05:47.225 07:01:05 -- common/autotest_common.sh@940 -- # kill -0 468466 00:05:47.225 07:01:05 -- common/autotest_common.sh@941 -- # uname 00:05:47.225 07:01:05 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:47.225 07:01:05 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 468466 00:05:47.225 07:01:05 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:47.225 07:01:05 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:47.225 07:01:05 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 468466' 00:05:47.225 killing process with pid 468466 00:05:47.225 07:01:05 -- common/autotest_common.sh@955 -- # kill 468466 00:05:47.225 07:01:05 -- common/autotest_common.sh@960 -- # wait 468466 00:05:47.484 00:05:47.484 real 0m1.511s 00:05:47.484 user 0m1.538s 00:05:47.484 sys 0m0.486s 00:05:47.484 07:01:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:47.484 07:01:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.484 ************************************ 00:05:47.484 END TEST dpdk_mem_utility 00:05:47.484 ************************************ 00:05:47.484 07:01:05 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:47.484 07:01:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:47.484 07:01:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:47.484 07:01:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.484 
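[Editor's note: the dpdk_mem_utility section above reduces to three commands, all visible in the trace: one RPC that makes the running target dump its DPDK allocator state, then two post-processing passes over that dump, producing the heap/mempool/memzone summary and the element-level listing respectively:

    scripts/rpc.py env_dpdk_get_mem_stats   # target writes /tmp/spdk_mem_dump.txt
    scripts/dpdk_mem_info.py                # heaps, mempools, memzones summary
    scripts/dpdk_mem_info.py -m 0           # per-element view of heap 0
]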
************************************ 00:05:47.484 START TEST event 00:05:47.484 ************************************ 00:05:47.484 07:01:05 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:05:47.743 * Looking for test storage... 00:05:47.743 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:47.743 07:01:05 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:47.743 07:01:05 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:47.743 07:01:05 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:47.743 07:01:05 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:47.743 07:01:05 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:47.743 07:01:05 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:47.743 07:01:05 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:47.743 07:01:05 -- scripts/common.sh@335 -- # IFS=.-: 00:05:47.743 07:01:05 -- scripts/common.sh@335 -- # read -ra ver1 00:05:47.743 07:01:05 -- scripts/common.sh@336 -- # IFS=.-: 00:05:47.743 07:01:05 -- scripts/common.sh@336 -- # read -ra ver2 00:05:47.743 07:01:05 -- scripts/common.sh@337 -- # local 'op=<' 00:05:47.743 07:01:05 -- scripts/common.sh@339 -- # ver1_l=2 00:05:47.743 07:01:05 -- scripts/common.sh@340 -- # ver2_l=1 00:05:47.743 07:01:05 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:47.743 07:01:05 -- scripts/common.sh@343 -- # case "$op" in 00:05:47.743 07:01:05 -- scripts/common.sh@344 -- # : 1 00:05:47.743 07:01:05 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:47.743 07:01:05 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:47.743 07:01:05 -- scripts/common.sh@364 -- # decimal 1 00:05:47.743 07:01:05 -- scripts/common.sh@352 -- # local d=1 00:05:47.743 07:01:05 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:47.743 07:01:05 -- scripts/common.sh@354 -- # echo 1 00:05:47.743 07:01:05 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:47.743 07:01:05 -- scripts/common.sh@365 -- # decimal 2 00:05:47.743 07:01:05 -- scripts/common.sh@352 -- # local d=2 00:05:47.743 07:01:05 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:47.743 07:01:05 -- scripts/common.sh@354 -- # echo 2 00:05:47.743 07:01:05 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:47.743 07:01:05 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:47.743 07:01:05 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:47.743 07:01:05 -- scripts/common.sh@367 -- # return 0 00:05:47.743 07:01:05 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:47.743 07:01:05 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:47.743 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.743 --rc genhtml_branch_coverage=1 00:05:47.743 --rc genhtml_function_coverage=1 00:05:47.743 --rc genhtml_legend=1 00:05:47.743 --rc geninfo_all_blocks=1 00:05:47.743 --rc geninfo_unexecuted_blocks=1 00:05:47.743 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.743 ' 00:05:47.743 07:01:05 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:47.743 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.743 --rc genhtml_branch_coverage=1 00:05:47.743 --rc genhtml_function_coverage=1 00:05:47.743 --rc genhtml_legend=1 00:05:47.743 --rc geninfo_all_blocks=1 00:05:47.743 --rc geninfo_unexecuted_blocks=1 00:05:47.743 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.743 ' 00:05:47.743 07:01:05 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:47.743 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.743 --rc genhtml_branch_coverage=1 00:05:47.743 --rc genhtml_function_coverage=1 00:05:47.743 --rc genhtml_legend=1 00:05:47.743 --rc geninfo_all_blocks=1 00:05:47.743 --rc geninfo_unexecuted_blocks=1 00:05:47.743 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.743 ' 00:05:47.743 07:01:05 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:47.743 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:47.743 --rc genhtml_branch_coverage=1 00:05:47.743 --rc genhtml_function_coverage=1 00:05:47.743 --rc genhtml_legend=1 00:05:47.743 --rc geninfo_all_blocks=1 00:05:47.743 --rc geninfo_unexecuted_blocks=1 00:05:47.743 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:47.743 ' 00:05:47.743 07:01:05 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:05:47.743 07:01:05 -- bdev/nbd_common.sh@6 -- # set -e 00:05:47.743 07:01:05 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:47.743 07:01:05 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:05:47.743 07:01:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:47.743 07:01:05 -- common/autotest_common.sh@10 -- # set +x 00:05:47.743 ************************************ 00:05:47.743 START TEST event_perf 00:05:47.743 ************************************ 00:05:47.743 07:01:05 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:47.743 Running I/O for 1 seconds...[2024-12-13 07:01:05.896258] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:47.743 [2024-12-13 07:01:05.896344] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468805 ] 00:05:47.743 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.743 [2024-12-13 07:01:05.979033] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:48.002 [2024-12-13 07:01:06.017895] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:48.002 [2024-12-13 07:01:06.018005] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:48.002 [2024-12-13 07:01:06.018092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:48.002 [2024-12-13 07:01:06.018093] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:48.938 Running I/O for 1 seconds... 00:05:48.938 lcore 0: 193659 00:05:48.938 lcore 1: 193661 00:05:48.938 lcore 2: 193659 00:05:48.938 lcore 3: 193657 00:05:48.938 done. 
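The event_perf binary above was launched with '-m 0xF -t 1': a four-bit core mask and a one-second run, and the lcore lines that precede 'done.' count the events each reactor processed in that second. As a rough illustration of how a hex core mask like 0xF expands into the lcore list seen in the output, here is a minimal bash sketch; the mask_to_lcores helper is hypothetical and not part of the SPDK tree:

    # Hypothetical helper: expand a hex core mask (e.g. 0xF) into lcore ids.
    mask_to_lcores() {
        local mask=$(( $1 )) bit=0      # $(( )) accepts 0x-prefixed hex
        while (( mask )); do
            (( mask & 1 )) && echo "lcore $bit"
            mask=$(( mask >> 1 ))
            bit=$(( bit + 1 ))
        done
    }
    mask_to_lcores 0xF                  # prints lcore 0 through lcore 3, one per reactor above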
00:05:48.938 00:05:48.938 real 0m1.194s 00:05:48.938 user 0m4.083s 00:05:48.938 sys 0m0.108s 00:05:48.938 07:01:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:48.938 07:01:07 -- common/autotest_common.sh@10 -- # set +x 00:05:48.938 ************************************ 00:05:48.938 END TEST event_perf 00:05:48.938 ************************************ 00:05:48.938 07:01:07 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:48.938 07:01:07 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:48.938 07:01:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:48.938 07:01:07 -- common/autotest_common.sh@10 -- # set +x 00:05:48.938 ************************************ 00:05:48.938 START TEST event_reactor 00:05:48.938 ************************************ 00:05:48.938 07:01:07 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:05:48.938 [2024-12-13 07:01:07.138165] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:48.938 [2024-12-13 07:01:07.138305] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469093 ] 00:05:48.938 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.196 [2024-12-13 07:01:07.221130] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.196 [2024-12-13 07:01:07.256143] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.133 test_start 00:05:50.133 oneshot 00:05:50.133 tick 100 00:05:50.133 tick 100 00:05:50.133 tick 250 00:05:50.133 tick 100 00:05:50.133 tick 100 00:05:50.133 tick 100 00:05:50.133 tick 250 00:05:50.133 tick 500 00:05:50.133 tick 100 00:05:50.133 tick 100 00:05:50.133 tick 250 00:05:50.133 tick 100 00:05:50.133 tick 100 00:05:50.133 test_end 00:05:50.133 00:05:50.133 real 0m1.189s 00:05:50.133 user 0m1.085s 00:05:50.133 sys 0m0.100s 00:05:50.133 07:01:08 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:50.133 07:01:08 -- common/autotest_common.sh@10 -- # set +x 00:05:50.133 ************************************ 00:05:50.133 END TEST event_reactor 00:05:50.133 ************************************ 00:05:50.133 07:01:08 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:50.133 07:01:08 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:05:50.133 07:01:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:50.133 07:01:08 -- common/autotest_common.sh@10 -- # set +x 00:05:50.133 ************************************ 00:05:50.133 START TEST event_reactor_perf 00:05:50.133 ************************************ 00:05:50.133 07:01:08 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:50.391 [2024-12-13 07:01:08.375485] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:05:50.391 [2024-12-13 07:01:08.375579] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469378 ] 00:05:50.392 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.392 [2024-12-13 07:01:08.459733] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.392 [2024-12-13 07:01:08.494976] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.328 test_start 00:05:51.328 test_end 00:05:51.328 Performance: 965347 events per second 00:05:51.328 00:05:51.328 real 0m1.190s 00:05:51.328 user 0m1.089s 00:05:51.328 sys 0m0.096s 00:05:51.328 07:01:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:51.328 07:01:09 -- common/autotest_common.sh@10 -- # set +x 00:05:51.328 ************************************ 00:05:51.328 END TEST event_reactor_perf 00:05:51.328 ************************************ 00:05:51.587 07:01:09 -- event/event.sh@49 -- # uname -s 00:05:51.587 07:01:09 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:51.587 07:01:09 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:51.587 07:01:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:51.587 07:01:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.587 07:01:09 -- common/autotest_common.sh@10 -- # set +x 00:05:51.587 ************************************ 00:05:51.587 START TEST event_scheduler 00:05:51.587 ************************************ 00:05:51.587 07:01:09 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:05:51.587 * Looking for test storage... 00:05:51.587 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:05:51.587 07:01:09 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:51.587 07:01:09 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:51.587 07:01:09 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:51.587 07:01:09 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:51.587 07:01:09 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:51.587 07:01:09 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:51.587 07:01:09 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:51.587 07:01:09 -- scripts/common.sh@335 -- # IFS=.-: 00:05:51.587 07:01:09 -- scripts/common.sh@335 -- # read -ra ver1 00:05:51.587 07:01:09 -- scripts/common.sh@336 -- # IFS=.-: 00:05:51.587 07:01:09 -- scripts/common.sh@336 -- # read -ra ver2 00:05:51.587 07:01:09 -- scripts/common.sh@337 -- # local 'op=<' 00:05:51.587 07:01:09 -- scripts/common.sh@339 -- # ver1_l=2 00:05:51.587 07:01:09 -- scripts/common.sh@340 -- # ver2_l=1 00:05:51.587 07:01:09 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:51.587 07:01:09 -- scripts/common.sh@343 -- # case "$op" in 00:05:51.587 07:01:09 -- scripts/common.sh@344 -- # : 1 00:05:51.587 07:01:09 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:51.587 07:01:09 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:51.587 07:01:09 -- scripts/common.sh@364 -- # decimal 1 00:05:51.587 07:01:09 -- scripts/common.sh@352 -- # local d=1 00:05:51.587 07:01:09 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:51.587 07:01:09 -- scripts/common.sh@354 -- # echo 1 00:05:51.587 07:01:09 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:51.587 07:01:09 -- scripts/common.sh@365 -- # decimal 2 00:05:51.587 07:01:09 -- scripts/common.sh@352 -- # local d=2 00:05:51.587 07:01:09 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:51.587 07:01:09 -- scripts/common.sh@354 -- # echo 2 00:05:51.587 07:01:09 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:51.587 07:01:09 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:51.587 07:01:09 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:51.587 07:01:09 -- scripts/common.sh@367 -- # return 0 00:05:51.587 07:01:09 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:51.587 07:01:09 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:51.587 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.587 --rc genhtml_branch_coverage=1 00:05:51.587 --rc genhtml_function_coverage=1 00:05:51.587 --rc genhtml_legend=1 00:05:51.587 --rc geninfo_all_blocks=1 00:05:51.587 --rc geninfo_unexecuted_blocks=1 00:05:51.587 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.587 ' 00:05:51.587 07:01:09 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:51.587 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.587 --rc genhtml_branch_coverage=1 00:05:51.587 --rc genhtml_function_coverage=1 00:05:51.587 --rc genhtml_legend=1 00:05:51.587 --rc geninfo_all_blocks=1 00:05:51.587 --rc geninfo_unexecuted_blocks=1 00:05:51.587 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.587 ' 00:05:51.587 07:01:09 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:51.587 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.587 --rc genhtml_branch_coverage=1 00:05:51.587 --rc genhtml_function_coverage=1 00:05:51.587 --rc genhtml_legend=1 00:05:51.587 --rc geninfo_all_blocks=1 00:05:51.587 --rc geninfo_unexecuted_blocks=1 00:05:51.587 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.587 ' 00:05:51.587 07:01:09 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:51.587 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:51.587 --rc genhtml_branch_coverage=1 00:05:51.587 --rc genhtml_function_coverage=1 00:05:51.587 --rc genhtml_legend=1 00:05:51.587 --rc geninfo_all_blocks=1 00:05:51.587 --rc geninfo_unexecuted_blocks=1 00:05:51.587 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:51.587 ' 00:05:51.587 07:01:09 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:51.587 07:01:09 -- scheduler/scheduler.sh@35 -- # scheduler_pid=469694 00:05:51.587 07:01:09 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:51.587 07:01:09 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:51.587 07:01:09 -- scheduler/scheduler.sh@37 -- # waitforlisten 469694 00:05:51.587 07:01:09 -- common/autotest_common.sh@829 -- # '[' -z 469694 ']' 00:05:51.587 07:01:09 -- common/autotest_common.sh@833 -- 
# local rpc_addr=/var/tmp/spdk.sock 00:05:51.587 07:01:09 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:51.587 07:01:09 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:51.587 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:51.587 07:01:09 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:51.587 07:01:09 -- common/autotest_common.sh@10 -- # set +x 00:05:51.587 [2024-12-13 07:01:09.819865] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:51.587 [2024-12-13 07:01:09.819958] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469694 ] 00:05:51.846 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.846 [2024-12-13 07:01:09.902069] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:51.846 [2024-12-13 07:01:09.940244] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:51.846 [2024-12-13 07:01:09.940370] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:51.846 [2024-12-13 07:01:09.940454] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:51.846 [2024-12-13 07:01:09.940455] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:51.846 07:01:09 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:51.846 07:01:09 -- common/autotest_common.sh@862 -- # return 0 00:05:51.846 07:01:09 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:51.846 07:01:09 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:51.846 07:01:09 -- common/autotest_common.sh@10 -- # set +x 00:05:51.846 POWER: Env isn't set yet! 00:05:51.846 POWER: Attempting to initialise ACPI cpufreq power management... 00:05:51.846 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:51.846 POWER: Cannot set governor of lcore 0 to userspace 00:05:51.846 POWER: Attempting to initialise PSTAT power management... 
00:05:51.846 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:05:51.846 POWER: Initialized successfully for lcore 0 power management 00:05:51.846 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:05:51.846 POWER: Initialized successfully for lcore 1 power management 00:05:51.846 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:05:51.846 POWER: Initialized successfully for lcore 2 power management 00:05:51.846 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:05:51.846 POWER: Initialized successfully for lcore 3 power management 00:05:51.846 [2024-12-13 07:01:10.038406] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:51.846 [2024-12-13 07:01:10.038419] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:51.846 [2024-12-13 07:01:10.038426] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:51.846 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:51.846 07:01:10 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:51.846 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:51.846 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 [2024-12-13 07:01:10.105102] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:52.106 07:01:10 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:52.106 07:01:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 ************************************ 00:05:52.106 START TEST scheduler_create_thread 00:05:52.106 ************************************ 00:05:52.106 07:01:10 -- common/autotest_common.sh@1114 -- # scheduler_create_thread 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 2 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 3 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 4 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 5 00:05:52.106 
07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 6 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 7 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 8 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 9 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 10 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:52.106 07:01:10 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:52.106 07:01:10 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:52.106 07:01:10 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:52.106 07:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:53.042 07:01:11 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:53.042 07:01:11 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:53.042 07:01:11 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:53.042 07:01:11 -- common/autotest_common.sh@10 -- # set +x 00:05:54.419 07:01:12 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:54.419 07:01:12 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:54.419 07:01:12 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:54.419 07:01:12 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:54.419 07:01:12 -- common/autotest_common.sh@10 -- # set +x 00:05:55.355 07:01:13 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:55.355 00:05:55.355 real 0m3.382s 00:05:55.355 user 0m0.023s 00:05:55.355 sys 0m0.007s 00:05:55.355 07:01:13 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:05:55.355 07:01:13 -- common/autotest_common.sh@10 -- # set +x 00:05:55.355 ************************************ 00:05:55.355 END TEST scheduler_create_thread 00:05:55.355 ************************************ 00:05:55.355 07:01:13 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:55.355 07:01:13 -- scheduler/scheduler.sh@46 -- # killprocess 469694 00:05:55.355 07:01:13 -- common/autotest_common.sh@936 -- # '[' -z 469694 ']' 00:05:55.355 07:01:13 -- common/autotest_common.sh@940 -- # kill -0 469694 00:05:55.355 07:01:13 -- common/autotest_common.sh@941 -- # uname 00:05:55.355 07:01:13 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:55.355 07:01:13 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 469694 00:05:55.613 07:01:13 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:05:55.613 07:01:13 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:05:55.613 07:01:13 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 469694' 00:05:55.613 killing process with pid 469694 00:05:55.613 07:01:13 -- common/autotest_common.sh@955 -- # kill 469694 00:05:55.613 07:01:13 -- common/autotest_common.sh@960 -- # wait 469694 00:05:55.873 [2024-12-13 07:01:13.876860] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:05:55.873 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:05:55.873 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:05:55.873 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:05:55.874 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:05:55.874 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:05:55.874 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:05:55.874 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:05:55.874 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:05:55.874 00:05:55.874 real 0m4.486s 00:05:55.874 user 0m7.837s 00:05:55.874 sys 0m0.405s 00:05:55.874 07:01:14 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:55.874 07:01:14 -- common/autotest_common.sh@10 -- # set +x 00:05:55.874 ************************************ 00:05:55.874 END TEST event_scheduler 00:05:55.874 ************************************ 00:05:56.134 07:01:14 -- event/event.sh@51 -- # modprobe -n nbd 00:05:56.134 07:01:14 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:56.134 07:01:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:56.134 07:01:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:56.134 07:01:14 -- common/autotest_common.sh@10 -- # set +x 00:05:56.134 ************************************ 00:05:56.134 START TEST app_repeat 00:05:56.134 ************************************ 00:05:56.134 07:01:14 -- common/autotest_common.sh@1114 -- # app_repeat_test 00:05:56.134 07:01:14 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:56.134 07:01:14 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:56.134 07:01:14 -- event/event.sh@13 -- # local nbd_list 00:05:56.134 07:01:14 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:56.134 07:01:14 -- 
event/event.sh@14 -- # local bdev_list 00:05:56.134 07:01:14 -- event/event.sh@15 -- # local repeat_times=4 00:05:56.134 07:01:14 -- event/event.sh@17 -- # modprobe nbd 00:05:56.134 07:01:14 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:56.134 07:01:14 -- event/event.sh@19 -- # repeat_pid=470553 00:05:56.134 07:01:14 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:56.134 07:01:14 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 470553' 00:05:56.134 Process app_repeat pid: 470553 00:05:56.134 07:01:14 -- event/event.sh@23 -- # for i in {0..2} 00:05:56.134 07:01:14 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:56.134 spdk_app_start Round 0 00:05:56.134 07:01:14 -- event/event.sh@25 -- # waitforlisten 470553 /var/tmp/spdk-nbd.sock 00:05:56.134 07:01:14 -- common/autotest_common.sh@829 -- # '[' -z 470553 ']' 00:05:56.134 07:01:14 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:56.134 07:01:14 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:56.134 07:01:14 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:56.134 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:56.134 07:01:14 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:56.134 07:01:14 -- common/autotest_common.sh@10 -- # set +x 00:05:56.134 [2024-12-13 07:01:14.167601] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:56.134 [2024-12-13 07:01:14.167681] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470553 ] 00:05:56.134 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.134 [2024-12-13 07:01:14.233313] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:56.134 [2024-12-13 07:01:14.269383] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:56.134 [2024-12-13 07:01:14.269385] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.071 07:01:15 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:57.071 07:01:15 -- common/autotest_common.sh@862 -- # return 0 00:05:57.071 07:01:15 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:57.071 Malloc0 00:05:57.071 07:01:15 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:57.329 Malloc1 00:05:57.329 07:01:15 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:57.329 07:01:15 -- 
bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@12 -- # local i 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:57.329 07:01:15 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:57.588 /dev/nbd0 00:05:57.588 07:01:15 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:57.588 07:01:15 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:57.588 07:01:15 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:57.588 07:01:15 -- common/autotest_common.sh@867 -- # local i 00:05:57.588 07:01:15 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:57.588 07:01:15 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:57.588 07:01:15 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:57.588 07:01:15 -- common/autotest_common.sh@871 -- # break 00:05:57.588 07:01:15 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:57.588 07:01:15 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:57.588 07:01:15 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:57.588 1+0 records in 00:05:57.588 1+0 records out 00:05:57.588 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229449 s, 17.9 MB/s 00:05:57.588 07:01:15 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:57.588 07:01:15 -- common/autotest_common.sh@884 -- # size=4096 00:05:57.588 07:01:15 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:57.588 07:01:15 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:57.588 07:01:15 -- common/autotest_common.sh@887 -- # return 0 00:05:57.588 07:01:15 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:57.588 07:01:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:57.588 07:01:15 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:57.588 /dev/nbd1 00:05:57.588 07:01:15 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:57.588 07:01:15 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:57.588 07:01:15 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:57.588 07:01:15 -- common/autotest_common.sh@867 -- # local i 00:05:57.588 07:01:15 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:57.588 07:01:15 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:57.588 07:01:15 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:57.588 07:01:15 -- common/autotest_common.sh@871 -- # break 00:05:57.588 07:01:15 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:57.588 07:01:15 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:57.588 07:01:15 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 
00:05:57.588 1+0 records in 00:05:57.588 1+0 records out 00:05:57.588 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231159 s, 17.7 MB/s 00:05:57.588 07:01:15 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:57.588 07:01:15 -- common/autotest_common.sh@884 -- # size=4096 00:05:57.588 07:01:15 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:57.847 07:01:15 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:57.847 07:01:15 -- common/autotest_common.sh@887 -- # return 0 00:05:57.847 07:01:15 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:57.847 07:01:15 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:57.847 07:01:15 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:57.847 07:01:15 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:57.847 07:01:15 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:57.847 { 00:05:57.847 "nbd_device": "/dev/nbd0", 00:05:57.847 "bdev_name": "Malloc0" 00:05:57.847 }, 00:05:57.847 { 00:05:57.847 "nbd_device": "/dev/nbd1", 00:05:57.847 "bdev_name": "Malloc1" 00:05:57.847 } 00:05:57.847 ]' 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:57.847 { 00:05:57.847 "nbd_device": "/dev/nbd0", 00:05:57.847 "bdev_name": "Malloc0" 00:05:57.847 }, 00:05:57.847 { 00:05:57.847 "nbd_device": "/dev/nbd1", 00:05:57.847 "bdev_name": "Malloc1" 00:05:57.847 } 00:05:57.847 ]' 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:57.847 /dev/nbd1' 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:57.847 /dev/nbd1' 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@65 -- # count=2 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@95 -- # count=2 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:57.847 256+0 records in 00:05:57.847 256+0 records out 00:05:57.847 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116463 s, 90.0 MB/s 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:57.847 07:01:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:58.106 256+0 records in 00:05:58.106 256+0 records out 00:05:58.106 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195771 s, 53.6 MB/s 00:05:58.106 07:01:16 -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:58.106 07:01:16 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:58.106 256+0 records in 00:05:58.106 256+0 records out 00:05:58.106 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212176 s, 49.4 MB/s 00:05:58.106 07:01:16 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:58.106 07:01:16 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.106 07:01:16 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:58.106 07:01:16 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:58.106 07:01:16 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:58.106 07:01:16 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:58.106 07:01:16 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:58.106 07:01:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@51 -- # local i 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@41 -- # break 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@45 -- # return 0 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:58.107 07:01:16 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@41 -- # break 00:05:58.365 07:01:16 -- 
bdev/nbd_common.sh@45 -- # return 0 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:58.365 07:01:16 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@65 -- # true 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@65 -- # count=0 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@104 -- # count=0 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:58.624 07:01:16 -- bdev/nbd_common.sh@109 -- # return 0 00:05:58.624 07:01:16 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:58.883 07:01:16 -- event/event.sh@35 -- # sleep 3 00:05:59.142 [2024-12-13 07:01:17.131982] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:59.142 [2024-12-13 07:01:17.164700] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:59.142 [2024-12-13 07:01:17.164702] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.142 [2024-12-13 07:01:17.204281] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:59.142 [2024-12-13 07:01:17.204328] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:02.429 07:01:19 -- event/event.sh@23 -- # for i in {0..2} 00:06:02.429 07:01:19 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:02.429 spdk_app_start Round 1 00:06:02.429 07:01:19 -- event/event.sh@25 -- # waitforlisten 470553 /var/tmp/spdk-nbd.sock 00:06:02.429 07:01:19 -- common/autotest_common.sh@829 -- # '[' -z 470553 ']' 00:06:02.429 07:01:19 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:02.429 07:01:19 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:02.429 07:01:19 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:02.429 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:02.429 07:01:19 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:02.429 07:01:19 -- common/autotest_common.sh@10 -- # set +x 00:06:02.429 07:01:20 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:02.429 07:01:20 -- common/autotest_common.sh@862 -- # return 0 00:06:02.429 07:01:20 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:02.429 Malloc0 00:06:02.429 07:01:20 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:02.429 Malloc1 00:06:02.429 07:01:20 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@12 -- # local i 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:02.429 07:01:20 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:02.688 /dev/nbd0 00:06:02.688 07:01:20 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:02.688 07:01:20 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:02.688 07:01:20 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:02.688 07:01:20 -- common/autotest_common.sh@867 -- # local i 00:06:02.688 07:01:20 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:02.688 07:01:20 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:02.688 07:01:20 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:02.688 07:01:20 -- common/autotest_common.sh@871 -- # break 00:06:02.688 07:01:20 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:02.688 07:01:20 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:02.688 07:01:20 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:02.688 1+0 records in 00:06:02.688 1+0 records out 00:06:02.688 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267088 s, 15.3 MB/s 00:06:02.688 07:01:20 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:02.688 07:01:20 -- common/autotest_common.sh@884 -- # size=4096 00:06:02.688 07:01:20 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:02.688 07:01:20 -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:02.688 07:01:20 -- common/autotest_common.sh@887 -- # return 0 00:06:02.688 07:01:20 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:02.688 07:01:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:02.688 07:01:20 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:02.688 /dev/nbd1 00:06:02.952 07:01:20 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:02.952 07:01:20 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:02.952 07:01:20 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:02.952 07:01:20 -- common/autotest_common.sh@867 -- # local i 00:06:02.952 07:01:20 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:02.952 07:01:20 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:02.952 07:01:20 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:02.952 07:01:20 -- common/autotest_common.sh@871 -- # break 00:06:02.952 07:01:20 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:02.952 07:01:20 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:02.952 07:01:20 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:02.952 1+0 records in 00:06:02.952 1+0 records out 00:06:02.952 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189478 s, 21.6 MB/s 00:06:02.952 07:01:20 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:02.952 07:01:20 -- common/autotest_common.sh@884 -- # size=4096 00:06:02.952 07:01:20 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:02.952 07:01:20 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:02.952 07:01:20 -- common/autotest_common.sh@887 -- # return 0 00:06:02.952 07:01:20 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:02.952 07:01:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:02.952 07:01:20 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:02.953 07:01:20 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:02.953 07:01:20 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:02.953 { 00:06:02.953 "nbd_device": "/dev/nbd0", 00:06:02.953 "bdev_name": "Malloc0" 00:06:02.953 }, 00:06:02.953 { 00:06:02.953 "nbd_device": "/dev/nbd1", 00:06:02.953 "bdev_name": "Malloc1" 00:06:02.953 } 00:06:02.953 ]' 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:02.953 { 00:06:02.953 "nbd_device": "/dev/nbd0", 00:06:02.953 "bdev_name": "Malloc0" 00:06:02.953 }, 00:06:02.953 { 00:06:02.953 "nbd_device": "/dev/nbd1", 00:06:02.953 "bdev_name": "Malloc1" 00:06:02.953 } 00:06:02.953 ]' 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:02.953 /dev/nbd1' 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:02.953 /dev/nbd1' 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@65 -- # count=2 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:02.953 07:01:21 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:02.953 07:01:21 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:03.216 256+0 records in 00:06:03.216 256+0 records out 00:06:03.216 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0116524 s, 90.0 MB/s 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:03.216 256+0 records in 00:06:03.216 256+0 records out 00:06:03.216 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0193678 s, 54.1 MB/s 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:03.216 256+0 records in 00:06:03.216 256+0 records out 00:06:03.216 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210765 s, 49.8 MB/s 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@51 -- # local i 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:03.216 07:01:21 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:03.474 07:01:21 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:03.474 07:01:21 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:03.474 07:01:21 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@41 -- # break 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@45 -- # return 0 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@41 -- # break 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@45 -- # return 0 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:03.475 07:01:21 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@65 -- # true 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@65 -- # count=0 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@104 -- # count=0 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:03.733 07:01:21 -- bdev/nbd_common.sh@109 -- # return 0 00:06:03.733 07:01:21 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:03.992 07:01:22 -- event/event.sh@35 -- # sleep 3 00:06:04.251 [2024-12-13 07:01:22.263286] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:04.251 [2024-12-13 07:01:22.296246] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:04.251 [2024-12-13 07:01:22.296249] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.251 [2024-12-13 07:01:22.335846] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:04.251 [2024-12-13 07:01:22.335887] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
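Round 0 and Round 1 above exercise the same Malloc-over-NBD round trip: two malloc bdevs are created over the /var/tmp/spdk-nbd.sock RPC socket (via 'bdev_malloc_create 64 4096'), exported as /dev/nbd0 and /dev/nbd1, written with 1 MiB of random data through dd, verified byte-for-byte with cmp, and torn down again. A condensed sketch of that flow, assuming the app_repeat server is already listening on the socket and using /tmp/nbdrandtest as a stand-in for the workspace nbdrandtest file shown in the trace:

    # Condensed sketch of the write/verify loop traced above; every RPC
    # subcommand appears verbatim in the log, only the paths are stand-ins.
    rpc="scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
    $rpc bdev_malloc_create 64 4096                  # -> Malloc0
    $rpc bdev_malloc_create 64 4096                  # -> Malloc1
    $rpc nbd_start_disk Malloc0 /dev/nbd0
    $rpc nbd_start_disk Malloc1 /dev/nbd1
    dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256     # 1 MiB of random data
    for d in /dev/nbd0 /dev/nbd1; do
        dd if=/tmp/nbdrandtest of=$d bs=4096 count=256 oflag=direct
        cmp -b -n 1M /tmp/nbdrandtest $d                         # byte-for-byte verify
    done
    $rpc nbd_stop_disk /dev/nbd0
    $rpc nbd_stop_disk /dev/nbd1

Each repeat round then restarts the app and repeats the same create/write/verify/stop cycle, which is what the 'spdk_app_start Round 2' banner below begins.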
00:06:07.538 07:01:25 -- event/event.sh@23 -- # for i in {0..2} 00:06:07.538 07:01:25 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:07.538 spdk_app_start Round 2 00:06:07.538 07:01:25 -- event/event.sh@25 -- # waitforlisten 470553 /var/tmp/spdk-nbd.sock 00:06:07.538 07:01:25 -- common/autotest_common.sh@829 -- # '[' -z 470553 ']' 00:06:07.538 07:01:25 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:07.538 07:01:25 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:07.538 07:01:25 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:07.538 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:07.538 07:01:25 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:07.538 07:01:25 -- common/autotest_common.sh@10 -- # set +x 00:06:07.538 07:01:25 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:07.538 07:01:25 -- common/autotest_common.sh@862 -- # return 0 00:06:07.538 07:01:25 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:07.538 Malloc0 00:06:07.538 07:01:25 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:07.538 Malloc1 00:06:07.538 07:01:25 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@12 -- # local i 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.538 07:01:25 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:07.797 /dev/nbd0 00:06:07.797 07:01:25 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:07.797 07:01:25 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:07.797 07:01:25 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:07.797 07:01:25 -- common/autotest_common.sh@867 -- # local i 00:06:07.797 07:01:25 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:07.797 07:01:25 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:07.797 07:01:25 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:07.797 07:01:25 -- common/autotest_common.sh@871 -- # break 00:06:07.797 07:01:25 -- common/autotest_common.sh@882 -- # (( i 
= 1 )) 00:06:07.797 07:01:25 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:07.797 07:01:25 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:07.797 1+0 records in 00:06:07.797 1+0 records out 00:06:07.797 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000211015 s, 19.4 MB/s 00:06:07.797 07:01:25 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.797 07:01:25 -- common/autotest_common.sh@884 -- # size=4096 00:06:07.797 07:01:25 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.797 07:01:25 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:07.797 07:01:25 -- common/autotest_common.sh@887 -- # return 0 00:06:07.797 07:01:25 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:07.797 07:01:25 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.797 07:01:25 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:08.055 /dev/nbd1 00:06:08.056 07:01:26 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:08.056 07:01:26 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:08.056 07:01:26 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:08.056 07:01:26 -- common/autotest_common.sh@867 -- # local i 00:06:08.056 07:01:26 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:08.056 07:01:26 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:08.056 07:01:26 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:08.056 07:01:26 -- common/autotest_common.sh@871 -- # break 00:06:08.056 07:01:26 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:08.056 07:01:26 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:08.056 07:01:26 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:08.056 1+0 records in 00:06:08.056 1+0 records out 00:06:08.056 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255544 s, 16.0 MB/s 00:06:08.056 07:01:26 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:08.056 07:01:26 -- common/autotest_common.sh@884 -- # size=4096 00:06:08.056 07:01:26 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:08.056 07:01:26 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:08.056 07:01:26 -- common/autotest_common.sh@887 -- # return 0 00:06:08.056 07:01:26 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:08.056 07:01:26 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:08.056 07:01:26 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:08.056 07:01:26 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:08.056 07:01:26 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:08.314 07:01:26 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:08.314 { 00:06:08.314 "nbd_device": "/dev/nbd0", 00:06:08.314 "bdev_name": "Malloc0" 00:06:08.314 }, 00:06:08.314 { 00:06:08.314 "nbd_device": "/dev/nbd1", 00:06:08.314 "bdev_name": "Malloc1" 00:06:08.314 } 00:06:08.314 ]' 00:06:08.314 07:01:26 -- 
bdev/nbd_common.sh@64 -- # echo '[ 00:06:08.315 { 00:06:08.315 "nbd_device": "/dev/nbd0", 00:06:08.315 "bdev_name": "Malloc0" 00:06:08.315 }, 00:06:08.315 { 00:06:08.315 "nbd_device": "/dev/nbd1", 00:06:08.315 "bdev_name": "Malloc1" 00:06:08.315 } 00:06:08.315 ]' 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:08.315 /dev/nbd1' 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:08.315 /dev/nbd1' 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@65 -- # count=2 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@95 -- # count=2 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:08.315 256+0 records in 00:06:08.315 256+0 records out 00:06:08.315 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.010758 s, 97.5 MB/s 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:08.315 256+0 records in 00:06:08.315 256+0 records out 00:06:08.315 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0198074 s, 52.9 MB/s 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:08.315 256+0 records in 00:06:08.315 256+0 records out 00:06:08.315 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0213876 s, 49.0 MB/s 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 
/dev/nbd1 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@51 -- # local i 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:08.315 07:01:26 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@41 -- # break 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@45 -- # return 0 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:08.574 07:01:26 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@41 -- # break 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@45 -- # return 0 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:08.833 07:01:26 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@65 -- # true 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@65 -- # count=0 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@104 -- # count=0 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:08.833 07:01:27 -- bdev/nbd_common.sh@109 -- # return 0 00:06:08.833 07:01:27 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:09.092 07:01:27 -- event/event.sh@35 -- # sleep 3 00:06:09.351 [2024-12-13 07:01:27.414660] app.c: 
798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:09.351 [2024-12-13 07:01:27.447611] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:09.351 [2024-12-13 07:01:27.447613] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.351 [2024-12-13 07:01:27.487234] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:09.351 [2024-12-13 07:01:27.487276] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:12.638 07:01:30 -- event/event.sh@38 -- # waitforlisten 470553 /var/tmp/spdk-nbd.sock 00:06:12.638 07:01:30 -- common/autotest_common.sh@829 -- # '[' -z 470553 ']' 00:06:12.638 07:01:30 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:12.638 07:01:30 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:12.638 07:01:30 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:12.638 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:12.638 07:01:30 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:12.638 07:01:30 -- common/autotest_common.sh@10 -- # set +x 00:06:12.638 07:01:30 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:12.638 07:01:30 -- common/autotest_common.sh@862 -- # return 0 00:06:12.638 07:01:30 -- event/event.sh@39 -- # killprocess 470553 00:06:12.638 07:01:30 -- common/autotest_common.sh@936 -- # '[' -z 470553 ']' 00:06:12.638 07:01:30 -- common/autotest_common.sh@940 -- # kill -0 470553 00:06:12.638 07:01:30 -- common/autotest_common.sh@941 -- # uname 00:06:12.638 07:01:30 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:12.638 07:01:30 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 470553 00:06:12.638 07:01:30 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:12.638 07:01:30 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:12.638 07:01:30 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 470553' 00:06:12.638 killing process with pid 470553 00:06:12.638 07:01:30 -- common/autotest_common.sh@955 -- # kill 470553 00:06:12.638 07:01:30 -- common/autotest_common.sh@960 -- # wait 470553 00:06:12.638 spdk_app_start is called in Round 0. 00:06:12.638 Shutdown signal received, stop current app iteration 00:06:12.638 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:12.638 spdk_app_start is called in Round 1. 00:06:12.638 Shutdown signal received, stop current app iteration 00:06:12.638 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:12.638 spdk_app_start is called in Round 2. 00:06:12.638 Shutdown signal received, stop current app iteration 00:06:12.638 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:12.638 spdk_app_start is called in Round 3. 
00:06:12.638 Shutdown signal received, stop current app iteration 00:06:12.638 07:01:30 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:12.638 07:01:30 -- event/event.sh@42 -- # return 0 00:06:12.638 00:06:12.638 real 0m16.495s 00:06:12.638 user 0m35.433s 00:06:12.638 sys 0m3.021s 00:06:12.638 07:01:30 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:12.638 07:01:30 -- common/autotest_common.sh@10 -- # set +x 00:06:12.638 ************************************ 00:06:12.638 END TEST app_repeat 00:06:12.638 ************************************ 00:06:12.638 07:01:30 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:12.638 07:01:30 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:12.638 07:01:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:12.638 07:01:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:12.638 07:01:30 -- common/autotest_common.sh@10 -- # set +x 00:06:12.638 ************************************ 00:06:12.638 START TEST cpu_locks 00:06:12.638 ************************************ 00:06:12.638 07:01:30 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:12.638 * Looking for test storage... 00:06:12.638 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:12.638 07:01:30 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:12.638 07:01:30 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:12.638 07:01:30 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:12.638 07:01:30 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:12.638 07:01:30 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:12.638 07:01:30 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:12.638 07:01:30 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:12.638 07:01:30 -- scripts/common.sh@335 -- # IFS=.-: 00:06:12.638 07:01:30 -- scripts/common.sh@335 -- # read -ra ver1 00:06:12.638 07:01:30 -- scripts/common.sh@336 -- # IFS=.-: 00:06:12.638 07:01:30 -- scripts/common.sh@336 -- # read -ra ver2 00:06:12.638 07:01:30 -- scripts/common.sh@337 -- # local 'op=<' 00:06:12.638 07:01:30 -- scripts/common.sh@339 -- # ver1_l=2 00:06:12.638 07:01:30 -- scripts/common.sh@340 -- # ver2_l=1 00:06:12.638 07:01:30 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:12.638 07:01:30 -- scripts/common.sh@343 -- # case "$op" in 00:06:12.638 07:01:30 -- scripts/common.sh@344 -- # : 1 00:06:12.638 07:01:30 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:12.638 07:01:30 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:12.638 07:01:30 -- scripts/common.sh@364 -- # decimal 1 00:06:12.638 07:01:30 -- scripts/common.sh@352 -- # local d=1 00:06:12.638 07:01:30 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:12.638 07:01:30 -- scripts/common.sh@354 -- # echo 1 00:06:12.638 07:01:30 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:12.638 07:01:30 -- scripts/common.sh@365 -- # decimal 2 00:06:12.638 07:01:30 -- scripts/common.sh@352 -- # local d=2 00:06:12.638 07:01:30 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:12.638 07:01:30 -- scripts/common.sh@354 -- # echo 2 00:06:12.638 07:01:30 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:12.638 07:01:30 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:12.638 07:01:30 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:12.638 07:01:30 -- scripts/common.sh@367 -- # return 0 00:06:12.638 07:01:30 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:12.638 07:01:30 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:12.638 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.638 --rc genhtml_branch_coverage=1 00:06:12.638 --rc genhtml_function_coverage=1 00:06:12.638 --rc genhtml_legend=1 00:06:12.638 --rc geninfo_all_blocks=1 00:06:12.638 --rc geninfo_unexecuted_blocks=1 00:06:12.638 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.638 ' 00:06:12.638 07:01:30 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:12.639 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.639 --rc genhtml_branch_coverage=1 00:06:12.639 --rc genhtml_function_coverage=1 00:06:12.639 --rc genhtml_legend=1 00:06:12.639 --rc geninfo_all_blocks=1 00:06:12.639 --rc geninfo_unexecuted_blocks=1 00:06:12.639 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.639 ' 00:06:12.639 07:01:30 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:12.639 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.639 --rc genhtml_branch_coverage=1 00:06:12.639 --rc genhtml_function_coverage=1 00:06:12.639 --rc genhtml_legend=1 00:06:12.639 --rc geninfo_all_blocks=1 00:06:12.639 --rc geninfo_unexecuted_blocks=1 00:06:12.639 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.639 ' 00:06:12.639 07:01:30 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:12.639 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.639 --rc genhtml_branch_coverage=1 00:06:12.639 --rc genhtml_function_coverage=1 00:06:12.639 --rc genhtml_legend=1 00:06:12.639 --rc geninfo_all_blocks=1 00:06:12.639 --rc geninfo_unexecuted_blocks=1 00:06:12.639 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.639 ' 00:06:12.639 07:01:30 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:12.639 07:01:30 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:12.639 07:01:30 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:12.639 07:01:30 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:12.639 07:01:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:12.639 07:01:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:12.639 07:01:30 -- common/autotest_common.sh@10 -- # set +x 00:06:12.639 ************************************ 00:06:12.639 START TEST default_locks 
00:06:12.639 ************************************ 00:06:12.897 07:01:30 -- common/autotest_common.sh@1114 -- # default_locks 00:06:12.897 07:01:30 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=473752 00:06:12.897 07:01:30 -- event/cpu_locks.sh@47 -- # waitforlisten 473752 00:06:12.897 07:01:30 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:12.898 07:01:30 -- common/autotest_common.sh@829 -- # '[' -z 473752 ']' 00:06:12.898 07:01:30 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:12.898 07:01:30 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:12.898 07:01:30 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:12.898 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:12.898 07:01:30 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:12.898 07:01:30 -- common/autotest_common.sh@10 -- # set +x 00:06:12.898 [2024-12-13 07:01:30.901068] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:12.898 [2024-12-13 07:01:30.901138] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid473752 ] 00:06:12.898 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.898 [2024-12-13 07:01:30.968055] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.898 [2024-12-13 07:01:31.004460] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:12.898 [2024-12-13 07:01:31.004573] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.836 07:01:31 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:13.836 07:01:31 -- common/autotest_common.sh@862 -- # return 0 00:06:13.836 07:01:31 -- event/cpu_locks.sh@49 -- # locks_exist 473752 00:06:13.836 07:01:31 -- event/cpu_locks.sh@22 -- # lslocks -p 473752 00:06:13.836 07:01:31 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:13.836 lslocks: write error 00:06:13.836 07:01:32 -- event/cpu_locks.sh@50 -- # killprocess 473752 00:06:13.836 07:01:32 -- common/autotest_common.sh@936 -- # '[' -z 473752 ']' 00:06:13.836 07:01:32 -- common/autotest_common.sh@940 -- # kill -0 473752 00:06:13.836 07:01:32 -- common/autotest_common.sh@941 -- # uname 00:06:13.836 07:01:32 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:13.836 07:01:32 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 473752 00:06:14.095 07:01:32 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:14.095 07:01:32 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:14.095 07:01:32 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 473752' 00:06:14.095 killing process with pid 473752 00:06:14.095 07:01:32 -- common/autotest_common.sh@955 -- # kill 473752 00:06:14.095 07:01:32 -- common/autotest_common.sh@960 -- # wait 473752 00:06:14.354 07:01:32 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 473752 00:06:14.354 07:01:32 -- common/autotest_common.sh@650 -- # local es=0 00:06:14.354 07:01:32 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 473752 00:06:14.354 07:01:32 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:14.354 07:01:32 -- common/autotest_common.sh@642 -- 
# case "$(type -t "$arg")" in 00:06:14.354 07:01:32 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:14.354 07:01:32 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:14.354 07:01:32 -- common/autotest_common.sh@653 -- # waitforlisten 473752 00:06:14.354 07:01:32 -- common/autotest_common.sh@829 -- # '[' -z 473752 ']' 00:06:14.354 07:01:32 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.354 07:01:32 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.354 07:01:32 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.354 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.354 07:01:32 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.354 07:01:32 -- common/autotest_common.sh@10 -- # set +x 00:06:14.354 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (473752) - No such process 00:06:14.354 ERROR: process (pid: 473752) is no longer running 00:06:14.354 07:01:32 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:14.354 07:01:32 -- common/autotest_common.sh@862 -- # return 1 00:06:14.354 07:01:32 -- common/autotest_common.sh@653 -- # es=1 00:06:14.354 07:01:32 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:14.354 07:01:32 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:14.354 07:01:32 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:14.354 07:01:32 -- event/cpu_locks.sh@54 -- # no_locks 00:06:14.354 07:01:32 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:14.354 07:01:32 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:14.354 07:01:32 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:14.354 00:06:14.354 real 0m1.525s 00:06:14.354 user 0m1.621s 00:06:14.354 sys 0m0.518s 00:06:14.354 07:01:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:14.354 07:01:32 -- common/autotest_common.sh@10 -- # set +x 00:06:14.354 ************************************ 00:06:14.354 END TEST default_locks 00:06:14.354 ************************************ 00:06:14.354 07:01:32 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:14.355 07:01:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:14.355 07:01:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.355 07:01:32 -- common/autotest_common.sh@10 -- # set +x 00:06:14.355 ************************************ 00:06:14.355 START TEST default_locks_via_rpc 00:06:14.355 ************************************ 00:06:14.355 07:01:32 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:06:14.355 07:01:32 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=474052 00:06:14.355 07:01:32 -- event/cpu_locks.sh@63 -- # waitforlisten 474052 00:06:14.355 07:01:32 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:14.355 07:01:32 -- common/autotest_common.sh@829 -- # '[' -z 474052 ']' 00:06:14.355 07:01:32 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.355 07:01:32 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.355 07:01:32 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.355 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
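Annotator's note: the lock probe that keeps appearing in these tests (event/cpu_locks.sh@22) just scans lslocks output for the spdk_cpu_lock files; condensed, with the pid taken from the trace above:

    pid=473752                                  # the spdk_tgt instance under test
    lslocks -p "$pid" | grep -q spdk_cpu_lock   # exit 0 iff the process holds a CPU core lock

The stray "lslocks: write error" lines are a side effect of this pipeline rather than a failure: grep -q exits as soon as it matches, and lslocks then gets a write error on the closed pipe.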
00:06:14.355 07:01:32 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.355 07:01:32 -- common/autotest_common.sh@10 -- # set +x 00:06:14.355 [2024-12-13 07:01:32.478522] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:14.355 [2024-12-13 07:01:32.478619] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid474052 ] 00:06:14.355 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.355 [2024-12-13 07:01:32.545057] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.355 [2024-12-13 07:01:32.578375] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:14.355 [2024-12-13 07:01:32.578487] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.291 07:01:33 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:15.291 07:01:33 -- common/autotest_common.sh@862 -- # return 0 00:06:15.291 07:01:33 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:15.291 07:01:33 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:15.291 07:01:33 -- common/autotest_common.sh@10 -- # set +x 00:06:15.291 07:01:33 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:15.291 07:01:33 -- event/cpu_locks.sh@67 -- # no_locks 00:06:15.291 07:01:33 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:15.291 07:01:33 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:15.291 07:01:33 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:15.291 07:01:33 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:15.291 07:01:33 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:15.291 07:01:33 -- common/autotest_common.sh@10 -- # set +x 00:06:15.291 07:01:33 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:15.291 07:01:33 -- event/cpu_locks.sh@71 -- # locks_exist 474052 00:06:15.291 07:01:33 -- event/cpu_locks.sh@22 -- # lslocks -p 474052 00:06:15.292 07:01:33 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:15.859 07:01:33 -- event/cpu_locks.sh@73 -- # killprocess 474052 00:06:15.859 07:01:33 -- common/autotest_common.sh@936 -- # '[' -z 474052 ']' 00:06:15.859 07:01:33 -- common/autotest_common.sh@940 -- # kill -0 474052 00:06:15.859 07:01:33 -- common/autotest_common.sh@941 -- # uname 00:06:15.859 07:01:33 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:15.859 07:01:33 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 474052 00:06:15.859 07:01:33 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:15.859 07:01:33 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:15.859 07:01:33 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 474052' 00:06:15.859 killing process with pid 474052 00:06:15.859 07:01:33 -- common/autotest_common.sh@955 -- # kill 474052 00:06:15.859 07:01:33 -- common/autotest_common.sh@960 -- # wait 474052 00:06:16.117 00:06:16.117 real 0m1.808s 00:06:16.117 user 0m1.939s 00:06:16.117 sys 0m0.596s 00:06:16.117 07:01:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:16.117 07:01:34 -- common/autotest_common.sh@10 -- # set +x 00:06:16.117 ************************************ 00:06:16.117 END TEST default_locks_via_rpc 00:06:16.117 ************************************ 00:06:16.117 07:01:34 -- event/cpu_locks.sh@168 -- # run_test 
non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:16.117 07:01:34 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:16.117 07:01:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:16.117 07:01:34 -- common/autotest_common.sh@10 -- # set +x 00:06:16.117 ************************************ 00:06:16.117 START TEST non_locking_app_on_locked_coremask 00:06:16.117 ************************************ 00:06:16.118 07:01:34 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:06:16.118 07:01:34 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=474356 00:06:16.118 07:01:34 -- event/cpu_locks.sh@81 -- # waitforlisten 474356 /var/tmp/spdk.sock 00:06:16.118 07:01:34 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:16.118 07:01:34 -- common/autotest_common.sh@829 -- # '[' -z 474356 ']' 00:06:16.118 07:01:34 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.118 07:01:34 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:16.118 07:01:34 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:16.118 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:16.118 07:01:34 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:16.118 07:01:34 -- common/autotest_common.sh@10 -- # set +x 00:06:16.118 [2024-12-13 07:01:34.338113] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:16.118 [2024-12-13 07:01:34.338214] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid474356 ] 00:06:16.376 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.376 [2024-12-13 07:01:34.407011] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.376 [2024-12-13 07:01:34.443096] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.376 [2024-12-13 07:01:34.443236] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.943 07:01:35 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:16.943 07:01:35 -- common/autotest_common.sh@862 -- # return 0 00:06:16.943 07:01:35 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=474566 00:06:16.943 07:01:35 -- event/cpu_locks.sh@85 -- # waitforlisten 474566 /var/tmp/spdk2.sock 00:06:16.943 07:01:35 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:16.943 07:01:35 -- common/autotest_common.sh@829 -- # '[' -z 474566 ']' 00:06:16.943 07:01:35 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:16.943 07:01:35 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:16.943 07:01:35 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:16.943 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:16.943 07:01:35 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:16.943 07:01:35 -- common/autotest_common.sh@10 -- # set +x 00:06:17.202 [2024-12-13 07:01:35.193664] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
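Annotator's note: the scenario being exercised here is two targets sharing one core mask where only the first claims the core lock. Stripped of the workspace paths, the two launches traced above are:

    spdk_tgt -m 0x1                                                  # first instance claims the lock on core 0
    spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock   # same mask, claiming disabled, so it starts cleanly

The second RPC socket (-r /var/tmp/spdk2.sock) is what lets both instances be driven independently within one test.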
00:06:17.202 [2024-12-13 07:01:35.193729] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid474566 ] 00:06:17.202 EAL: No free 2048 kB hugepages reported on node 1 00:06:17.202 [2024-12-13 07:01:35.283883] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:17.202 [2024-12-13 07:01:35.283915] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.202 [2024-12-13 07:01:35.356148] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:17.202 [2024-12-13 07:01:35.356284] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.137 07:01:36 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:18.137 07:01:36 -- common/autotest_common.sh@862 -- # return 0 00:06:18.137 07:01:36 -- event/cpu_locks.sh@87 -- # locks_exist 474356 00:06:18.137 07:01:36 -- event/cpu_locks.sh@22 -- # lslocks -p 474356 00:06:18.137 07:01:36 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:19.073 lslocks: write error 00:06:19.073 07:01:36 -- event/cpu_locks.sh@89 -- # killprocess 474356 00:06:19.073 07:01:36 -- common/autotest_common.sh@936 -- # '[' -z 474356 ']' 00:06:19.073 07:01:36 -- common/autotest_common.sh@940 -- # kill -0 474356 00:06:19.073 07:01:36 -- common/autotest_common.sh@941 -- # uname 00:06:19.073 07:01:36 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:19.073 07:01:36 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 474356 00:06:19.073 07:01:37 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:19.073 07:01:37 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:19.073 07:01:37 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 474356' 00:06:19.073 killing process with pid 474356 00:06:19.073 07:01:37 -- common/autotest_common.sh@955 -- # kill 474356 00:06:19.073 07:01:37 -- common/autotest_common.sh@960 -- # wait 474356 00:06:19.641 07:01:37 -- event/cpu_locks.sh@90 -- # killprocess 474566 00:06:19.641 07:01:37 -- common/autotest_common.sh@936 -- # '[' -z 474566 ']' 00:06:19.641 07:01:37 -- common/autotest_common.sh@940 -- # kill -0 474566 00:06:19.641 07:01:37 -- common/autotest_common.sh@941 -- # uname 00:06:19.641 07:01:37 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:19.641 07:01:37 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 474566 00:06:19.641 07:01:37 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:19.641 07:01:37 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:19.641 07:01:37 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 474566' 00:06:19.641 killing process with pid 474566 00:06:19.641 07:01:37 -- common/autotest_common.sh@955 -- # kill 474566 00:06:19.641 07:01:37 -- common/autotest_common.sh@960 -- # wait 474566 00:06:19.899 00:06:19.899 real 0m3.679s 00:06:19.899 user 0m3.966s 00:06:19.899 sys 0m1.176s 00:06:19.899 07:01:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:19.899 07:01:37 -- common/autotest_common.sh@10 -- # set +x 00:06:19.899 ************************************ 00:06:19.900 END TEST non_locking_app_on_locked_coremask 00:06:19.900 ************************************ 00:06:19.900 07:01:38 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 
00:06:19.900 07:01:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:19.900 07:01:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:19.900 07:01:38 -- common/autotest_common.sh@10 -- # set +x 00:06:19.900 ************************************ 00:06:19.900 START TEST locking_app_on_unlocked_coremask 00:06:19.900 ************************************ 00:06:19.900 07:01:38 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:06:19.900 07:01:38 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=475073 00:06:19.900 07:01:38 -- event/cpu_locks.sh@99 -- # waitforlisten 475073 /var/tmp/spdk.sock 00:06:19.900 07:01:38 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:19.900 07:01:38 -- common/autotest_common.sh@829 -- # '[' -z 475073 ']' 00:06:19.900 07:01:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.900 07:01:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:19.900 07:01:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.900 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.900 07:01:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:19.900 07:01:38 -- common/autotest_common.sh@10 -- # set +x 00:06:19.900 [2024-12-13 07:01:38.067214] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:19.900 [2024-12-13 07:01:38.067329] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475073 ] 00:06:19.900 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.900 [2024-12-13 07:01:38.132846] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:19.900 [2024-12-13 07:01:38.132874] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.158 [2024-12-13 07:01:38.167877] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:20.158 [2024-12-13 07:01:38.168006] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.725 07:01:38 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:20.725 07:01:38 -- common/autotest_common.sh@862 -- # return 0 00:06:20.725 07:01:38 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:20.725 07:01:38 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=475210 00:06:20.725 07:01:38 -- event/cpu_locks.sh@103 -- # waitforlisten 475210 /var/tmp/spdk2.sock 00:06:20.725 07:01:38 -- common/autotest_common.sh@829 -- # '[' -z 475210 ']' 00:06:20.725 07:01:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:20.725 07:01:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:20.725 07:01:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:20.725 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
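Annotator's note: this test inverts the previous one. Now the first target opts out of lock claiming, so a second, lock-claiming target can take core 0 unopposed; again with paths stripped:

    spdk_tgt -m 0x1 --disable-cpumask-locks      # first instance: no lock claimed on core 0
    spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock       # second instance: claims core 0 without conflict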
00:06:20.725 07:01:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:20.725 07:01:38 -- common/autotest_common.sh@10 -- # set +x 00:06:20.725 [2024-12-13 07:01:38.910765] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:20.725 [2024-12-13 07:01:38.910844] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475210 ] 00:06:20.725 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.984 [2024-12-13 07:01:39.007447] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.984 [2024-12-13 07:01:39.079477] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:20.984 [2024-12-13 07:01:39.079595] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.551 07:01:39 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:21.551 07:01:39 -- common/autotest_common.sh@862 -- # return 0 00:06:21.551 07:01:39 -- event/cpu_locks.sh@105 -- # locks_exist 475210 00:06:21.551 07:01:39 -- event/cpu_locks.sh@22 -- # lslocks -p 475210 00:06:21.551 07:01:39 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:22.928 lslocks: write error 00:06:22.928 07:01:40 -- event/cpu_locks.sh@107 -- # killprocess 475073 00:06:22.928 07:01:40 -- common/autotest_common.sh@936 -- # '[' -z 475073 ']' 00:06:22.928 07:01:40 -- common/autotest_common.sh@940 -- # kill -0 475073 00:06:22.928 07:01:40 -- common/autotest_common.sh@941 -- # uname 00:06:22.928 07:01:40 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:22.928 07:01:40 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 475073 00:06:22.928 07:01:40 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:22.928 07:01:40 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:22.928 07:01:40 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 475073' 00:06:22.928 killing process with pid 475073 00:06:22.928 07:01:40 -- common/autotest_common.sh@955 -- # kill 475073 00:06:22.928 07:01:40 -- common/autotest_common.sh@960 -- # wait 475073 00:06:23.495 07:01:41 -- event/cpu_locks.sh@108 -- # killprocess 475210 00:06:23.495 07:01:41 -- common/autotest_common.sh@936 -- # '[' -z 475210 ']' 00:06:23.495 07:01:41 -- common/autotest_common.sh@940 -- # kill -0 475210 00:06:23.495 07:01:41 -- common/autotest_common.sh@941 -- # uname 00:06:23.495 07:01:41 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:23.495 07:01:41 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 475210 00:06:23.495 07:01:41 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:23.495 07:01:41 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:23.495 07:01:41 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 475210' 00:06:23.495 killing process with pid 475210 00:06:23.495 07:01:41 -- common/autotest_common.sh@955 -- # kill 475210 00:06:23.495 07:01:41 -- common/autotest_common.sh@960 -- # wait 475210 00:06:23.755 00:06:23.755 real 0m3.862s 00:06:23.755 user 0m4.137s 00:06:23.755 sys 0m1.315s 00:06:23.755 07:01:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:23.755 07:01:41 -- common/autotest_common.sh@10 -- # set +x 00:06:23.755 ************************************ 00:06:23.755 END TEST locking_app_on_unlocked_coremask 00:06:23.755 
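Annotator's note: killprocess, traced many times above (autotest_common.sh@936-960), follows a fixed shape. A condensed sketch that drops the Linux and sudo special-casing visible in the trace:

    killprocess() {
        local pid=$1
        kill -0 "$pid"                           # assert the process still exists
        local name
        name=$(ps --no-headers -o comm= "$pid")  # reads back reactor_0 for a healthy spdk_tgt
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                              # reap it; spdk_tgt is a child of the test shell
    }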
************************************ 00:06:23.755 07:01:41 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:23.755 07:01:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:23.755 07:01:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:23.755 07:01:41 -- common/autotest_common.sh@10 -- # set +x 00:06:23.755 ************************************ 00:06:23.755 START TEST locking_app_on_locked_coremask 00:06:23.755 ************************************ 00:06:23.755 07:01:41 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:06:23.755 07:01:41 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=475788 00:06:23.755 07:01:41 -- event/cpu_locks.sh@116 -- # waitforlisten 475788 /var/tmp/spdk.sock 00:06:23.755 07:01:41 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:23.755 07:01:41 -- common/autotest_common.sh@829 -- # '[' -z 475788 ']' 00:06:23.755 07:01:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.755 07:01:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:23.755 07:01:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.755 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.755 07:01:41 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:23.755 07:01:41 -- common/autotest_common.sh@10 -- # set +x 00:06:23.755 [2024-12-13 07:01:41.979695] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:23.755 [2024-12-13 07:01:41.979768] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475788 ] 00:06:24.013 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.013 [2024-12-13 07:01:42.045932] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.013 [2024-12-13 07:01:42.078432] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:24.014 [2024-12-13 07:01:42.078560] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.581 07:01:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:24.581 07:01:42 -- common/autotest_common.sh@862 -- # return 0 00:06:24.581 07:01:42 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=476032 00:06:24.581 07:01:42 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 476032 /var/tmp/spdk2.sock 00:06:24.581 07:01:42 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:24.581 07:01:42 -- common/autotest_common.sh@650 -- # local es=0 00:06:24.581 07:01:42 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 476032 /var/tmp/spdk2.sock 00:06:24.581 07:01:42 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:24.581 07:01:42 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.581 07:01:42 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:24.581 07:01:42 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.581 07:01:42 -- common/autotest_common.sh@653 -- # waitforlisten 476032 /var/tmp/spdk2.sock 00:06:24.581 07:01:42 -- common/autotest_common.sh@829 -- # '[' -z 476032 
']' 00:06:24.581 07:01:42 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:24.581 07:01:42 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:24.581 07:01:42 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:24.581 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:24.581 07:01:42 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:24.581 07:01:42 -- common/autotest_common.sh@10 -- # set +x 00:06:24.841 [2024-12-13 07:01:42.826911] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:24.841 [2024-12-13 07:01:42.827001] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476032 ] 00:06:24.841 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.841 [2024-12-13 07:01:42.914699] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 475788 has claimed it. 00:06:24.842 [2024-12-13 07:01:42.914737] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:25.409 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (476032) - No such process 00:06:25.409 ERROR: process (pid: 476032) is no longer running 00:06:25.409 07:01:43 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:25.409 07:01:43 -- common/autotest_common.sh@862 -- # return 1 00:06:25.409 07:01:43 -- common/autotest_common.sh@653 -- # es=1 00:06:25.409 07:01:43 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:25.409 07:01:43 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:25.409 07:01:43 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:25.409 07:01:43 -- event/cpu_locks.sh@122 -- # locks_exist 475788 00:06:25.409 07:01:43 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:25.409 07:01:43 -- event/cpu_locks.sh@22 -- # lslocks -p 475788 00:06:25.976 lslocks: write error 00:06:25.976 07:01:44 -- event/cpu_locks.sh@124 -- # killprocess 475788 00:06:25.976 07:01:44 -- common/autotest_common.sh@936 -- # '[' -z 475788 ']' 00:06:25.976 07:01:44 -- common/autotest_common.sh@940 -- # kill -0 475788 00:06:25.976 07:01:44 -- common/autotest_common.sh@941 -- # uname 00:06:25.976 07:01:44 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:25.976 07:01:44 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 475788 00:06:25.976 07:01:44 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:25.976 07:01:44 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:25.976 07:01:44 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 475788' 00:06:25.976 killing process with pid 475788 00:06:25.976 07:01:44 -- common/autotest_common.sh@955 -- # kill 475788 00:06:25.976 07:01:44 -- common/autotest_common.sh@960 -- # wait 475788 00:06:26.234 00:06:26.234 real 0m2.476s 00:06:26.234 user 0m2.729s 00:06:26.234 sys 0m0.758s 00:06:26.234 07:01:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:26.234 07:01:44 -- common/autotest_common.sh@10 -- # set +x 00:06:26.234 ************************************ 00:06:26.235 END TEST locking_app_on_locked_coremask 00:06:26.235 ************************************ 00:06:26.235 07:01:44 -- 
event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:26.235 07:01:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:26.493 07:01:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:26.493 07:01:44 -- common/autotest_common.sh@10 -- # set +x 00:06:26.493 ************************************ 00:06:26.493 START TEST locking_overlapped_coremask 00:06:26.493 ************************************ 00:06:26.493 07:01:44 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:06:26.493 07:01:44 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=476350 00:06:26.493 07:01:44 -- event/cpu_locks.sh@133 -- # waitforlisten 476350 /var/tmp/spdk.sock 00:06:26.493 07:01:44 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:26.493 07:01:44 -- common/autotest_common.sh@829 -- # '[' -z 476350 ']' 00:06:26.494 07:01:44 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:26.494 07:01:44 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:26.494 07:01:44 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:26.494 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:26.494 07:01:44 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:26.494 07:01:44 -- common/autotest_common.sh@10 -- # set +x 00:06:26.494 [2024-12-13 07:01:44.507338] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:26.494 [2024-12-13 07:01:44.507431] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476350 ] 00:06:26.494 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.494 [2024-12-13 07:01:44.575090] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:26.494 [2024-12-13 07:01:44.609129] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:26.494 [2024-12-13 07:01:44.609321] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:26.494 [2024-12-13 07:01:44.609424] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.494 [2024-12-13 07:01:44.609425] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:27.428 07:01:45 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:27.428 07:01:45 -- common/autotest_common.sh@862 -- # return 0 00:06:27.428 07:01:45 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=476372 00:06:27.428 07:01:45 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 476372 /var/tmp/spdk2.sock 00:06:27.428 07:01:45 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:27.428 07:01:45 -- common/autotest_common.sh@650 -- # local es=0 00:06:27.428 07:01:45 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 476372 /var/tmp/spdk2.sock 00:06:27.428 07:01:45 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:27.428 07:01:45 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:27.428 07:01:45 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:27.428 07:01:45 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:27.428 07:01:45 -- 
common/autotest_common.sh@653 -- # waitforlisten 476372 /var/tmp/spdk2.sock 00:06:27.428 07:01:45 -- common/autotest_common.sh@829 -- # '[' -z 476372 ']' 00:06:27.428 07:01:45 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:27.428 07:01:45 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:27.428 07:01:45 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:27.428 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:27.428 07:01:45 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:27.428 07:01:45 -- common/autotest_common.sh@10 -- # set +x 00:06:27.428 [2024-12-13 07:01:45.355175] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:27.429 [2024-12-13 07:01:45.355277] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476372 ] 00:06:27.429 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.429 [2024-12-13 07:01:45.449549] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 476350 has claimed it. 00:06:27.429 [2024-12-13 07:01:45.449589] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:27.996 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (476372) - No such process 00:06:27.996 ERROR: process (pid: 476372) is no longer running 00:06:27.996 07:01:46 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:27.996 07:01:46 -- common/autotest_common.sh@862 -- # return 1 00:06:27.996 07:01:46 -- common/autotest_common.sh@653 -- # es=1 00:06:27.996 07:01:46 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:27.996 07:01:46 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:27.996 07:01:46 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:27.996 07:01:46 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:27.996 07:01:46 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:27.996 07:01:46 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:27.996 07:01:46 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:27.996 07:01:46 -- event/cpu_locks.sh@141 -- # killprocess 476350 00:06:27.996 07:01:46 -- common/autotest_common.sh@936 -- # '[' -z 476350 ']' 00:06:27.996 07:01:46 -- common/autotest_common.sh@940 -- # kill -0 476350 00:06:27.996 07:01:46 -- common/autotest_common.sh@941 -- # uname 00:06:27.996 07:01:46 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:27.996 07:01:46 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 476350 00:06:27.996 07:01:46 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:27.996 07:01:46 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:27.996 07:01:46 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 476350' 00:06:27.996 killing process with pid 476350 00:06:27.996 07:01:46 -- common/autotest_common.sh@955 -- # kill 476350 00:06:27.996 07:01:46 -- 
common/autotest_common.sh@960 -- # wait 476350 00:06:28.255 00:06:28.255 real 0m1.888s 00:06:28.255 user 0m5.454s 00:06:28.255 sys 0m0.436s 00:06:28.255 07:01:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:28.255 07:01:46 -- common/autotest_common.sh@10 -- # set +x 00:06:28.255 ************************************ 00:06:28.255 END TEST locking_overlapped_coremask 00:06:28.255 ************************************ 00:06:28.255 07:01:46 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:28.255 07:01:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:28.255 07:01:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:28.255 07:01:46 -- common/autotest_common.sh@10 -- # set +x 00:06:28.255 ************************************ 00:06:28.255 START TEST locking_overlapped_coremask_via_rpc 00:06:28.255 ************************************ 00:06:28.255 07:01:46 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:06:28.255 07:01:46 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=476660 00:06:28.255 07:01:46 -- event/cpu_locks.sh@149 -- # waitforlisten 476660 /var/tmp/spdk.sock 00:06:28.255 07:01:46 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:28.255 07:01:46 -- common/autotest_common.sh@829 -- # '[' -z 476660 ']' 00:06:28.255 07:01:46 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:28.255 07:01:46 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:28.255 07:01:46 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:28.255 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:28.255 07:01:46 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:28.255 07:01:46 -- common/autotest_common.sh@10 -- # set +x 00:06:28.255 [2024-12-13 07:01:46.445543] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:28.255 [2024-12-13 07:01:46.445631] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476660 ] 00:06:28.255 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.514 [2024-12-13 07:01:46.512863] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
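The xtrace above shows how the harness decides this test passed: check_remaining_locks globs the CPU lock files the target left in /var/tmp and compares them against the set expected for the 0x7 core mask. A minimal sketch of that check, simplified from the traced autotest helper (the glob and brace expansion are taken verbatim from the trace; everything else is abridged):

    check_remaining_locks() {
        local locks=(/var/tmp/spdk_cpu_lock_*)              # lock files actually present
        local expected=(/var/tmp/spdk_cpu_lock_{000..002})  # cores 0-2 for mask 0x7
        [[ "${locks[*]}" == "${expected[*]}" ]]             # any mismatch fails the test
    }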
00:06:28.514 [2024-12-13 07:01:46.512892] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:28.514 [2024-12-13 07:01:46.546852] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:28.514 [2024-12-13 07:01:46.547002] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:28.514 [2024-12-13 07:01:46.547117] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.514 [2024-12-13 07:01:46.547117] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:29.081 07:01:47 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.081 07:01:47 -- common/autotest_common.sh@862 -- # return 0 00:06:29.081 07:01:47 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=476878 00:06:29.081 07:01:47 -- event/cpu_locks.sh@153 -- # waitforlisten 476878 /var/tmp/spdk2.sock 00:06:29.081 07:01:47 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:29.081 07:01:47 -- common/autotest_common.sh@829 -- # '[' -z 476878 ']' 00:06:29.081 07:01:47 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:29.081 07:01:47 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:29.081 07:01:47 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:29.081 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:29.081 07:01:47 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:29.081 07:01:47 -- common/autotest_common.sh@10 -- # set +x 00:06:29.081 [2024-12-13 07:01:47.307670] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:29.081 [2024-12-13 07:01:47.307757] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476878 ] 00:06:29.339 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.339 [2024-12-13 07:01:47.402586] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
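This test starts its targets with --disable-cpumask-locks, so the "CPU core locks deactivated" notice above replaces the usual lock claim: presumably no /var/tmp/spdk_cpu_lock_* files are taken at startup, which is what lets a second target with an overlapping mask come up below. The two launches, as traced in this log (binary path as in this workspace; 0x7 and 0x1c overlap on core 2):

    spdk_tgt -m 0x7  --disable-cpumask-locks                        # first target, cores 0-2
    spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks # second target, cores 2-4, own RPC socket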
00:06:29.339 [2024-12-13 07:01:47.402615] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:29.339 [2024-12-13 07:01:47.476069] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:29.339 [2024-12-13 07:01:47.476241] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:29.339 [2024-12-13 07:01:47.480233] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:29.339 [2024-12-13 07:01:47.480235] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:29.906 07:01:48 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.906 07:01:48 -- common/autotest_common.sh@862 -- # return 0 00:06:29.906 07:01:48 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:29.906 07:01:48 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:29.906 07:01:48 -- common/autotest_common.sh@10 -- # set +x 00:06:29.906 07:01:48 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:30.165 07:01:48 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:30.165 07:01:48 -- common/autotest_common.sh@650 -- # local es=0 00:06:30.165 07:01:48 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:30.165 07:01:48 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:30.165 07:01:48 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:30.165 07:01:48 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:30.165 07:01:48 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:30.165 07:01:48 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:30.165 07:01:48 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:30.165 07:01:48 -- common/autotest_common.sh@10 -- # set +x 00:06:30.165 [2024-12-13 07:01:48.157246] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 476660 has claimed it. 00:06:30.165 request: 00:06:30.165 { 00:06:30.165 "method": "framework_enable_cpumask_locks", 00:06:30.165 "req_id": 1 00:06:30.165 } 00:06:30.165 Got JSON-RPC error response 00:06:30.165 response: 00:06:30.165 { 00:06:30.165 "code": -32603, 00:06:30.165 "message": "Failed to claim CPU core: 2" 00:06:30.165 } 00:06:30.165 07:01:48 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:30.165 07:01:48 -- common/autotest_common.sh@653 -- # es=1 00:06:30.165 07:01:48 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:30.165 07:01:48 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:30.165 07:01:48 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:30.165 07:01:48 -- event/cpu_locks.sh@158 -- # waitforlisten 476660 /var/tmp/spdk.sock 00:06:30.165 07:01:48 -- common/autotest_common.sh@829 -- # '[' -z 476660 ']' 00:06:30.165 07:01:48 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.165 07:01:48 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:30.165 07:01:48 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:30.165 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
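The request/response pair captured above is the raw JSON-RPC exchange on the target's UNIX socket: framework_enable_cpumask_locks succeeds on the first target, then fails on the second with -32603 because core 2 is already locked. Assuming the standard rpc.py client shipped in this tree, the same two calls would look like:

    ./scripts/rpc.py -s /var/tmp/spdk.sock  framework_enable_cpumask_locks  # first target claims cores 0-2
    ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks  # "Failed to claim CPU core: 2"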
00:06:30.165 07:01:48 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:30.165 07:01:48 -- common/autotest_common.sh@10 -- # set +x 00:06:30.165 07:01:48 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:30.165 07:01:48 -- common/autotest_common.sh@862 -- # return 0 00:06:30.165 07:01:48 -- event/cpu_locks.sh@159 -- # waitforlisten 476878 /var/tmp/spdk2.sock 00:06:30.165 07:01:48 -- common/autotest_common.sh@829 -- # '[' -z 476878 ']' 00:06:30.165 07:01:48 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:30.165 07:01:48 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:30.165 07:01:48 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:30.165 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:30.165 07:01:48 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:30.165 07:01:48 -- common/autotest_common.sh@10 -- # set +x 00:06:30.424 07:01:48 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:30.424 07:01:48 -- common/autotest_common.sh@862 -- # return 0 00:06:30.424 07:01:48 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:30.424 07:01:48 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:30.424 07:01:48 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:30.424 07:01:48 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:30.424 00:06:30.424 real 0m2.129s 00:06:30.424 user 0m0.868s 00:06:30.424 sys 0m0.190s 00:06:30.424 07:01:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:30.424 07:01:48 -- common/autotest_common.sh@10 -- # set +x 00:06:30.424 ************************************ 00:06:30.424 END TEST locking_overlapped_coremask_via_rpc 00:06:30.424 ************************************ 00:06:30.424 07:01:48 -- event/cpu_locks.sh@174 -- # cleanup 00:06:30.424 07:01:48 -- event/cpu_locks.sh@15 -- # [[ -z 476660 ]] 00:06:30.424 07:01:48 -- event/cpu_locks.sh@15 -- # killprocess 476660 00:06:30.424 07:01:48 -- common/autotest_common.sh@936 -- # '[' -z 476660 ']' 00:06:30.424 07:01:48 -- common/autotest_common.sh@940 -- # kill -0 476660 00:06:30.424 07:01:48 -- common/autotest_common.sh@941 -- # uname 00:06:30.424 07:01:48 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:30.424 07:01:48 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 476660 00:06:30.424 07:01:48 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:30.424 07:01:48 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:30.424 07:01:48 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 476660' 00:06:30.424 killing process with pid 476660 00:06:30.424 07:01:48 -- common/autotest_common.sh@955 -- # kill 476660 00:06:30.424 07:01:48 -- common/autotest_common.sh@960 -- # wait 476660 00:06:30.991 07:01:48 -- event/cpu_locks.sh@16 -- # [[ -z 476878 ]] 00:06:30.991 07:01:48 -- event/cpu_locks.sh@16 -- # killprocess 476878 00:06:30.991 07:01:48 -- common/autotest_common.sh@936 -- # '[' -z 476878 ']' 00:06:30.991 07:01:48 -- common/autotest_common.sh@940 -- # kill -0 476878 00:06:30.991 07:01:48 -- common/autotest_common.sh@941 -- # uname 00:06:30.991 
07:01:48 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:30.991 07:01:48 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 476878 00:06:30.991 07:01:49 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:30.991 07:01:49 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:30.991 07:01:49 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 476878' 00:06:30.991 killing process with pid 476878 00:06:30.991 07:01:49 -- common/autotest_common.sh@955 -- # kill 476878 00:06:30.992 07:01:49 -- common/autotest_common.sh@960 -- # wait 476878 00:06:31.251 07:01:49 -- event/cpu_locks.sh@18 -- # rm -f 00:06:31.251 07:01:49 -- event/cpu_locks.sh@1 -- # cleanup 00:06:31.251 07:01:49 -- event/cpu_locks.sh@15 -- # [[ -z 476660 ]] 00:06:31.251 07:01:49 -- event/cpu_locks.sh@15 -- # killprocess 476660 00:06:31.251 07:01:49 -- common/autotest_common.sh@936 -- # '[' -z 476660 ']' 00:06:31.251 07:01:49 -- common/autotest_common.sh@940 -- # kill -0 476660 00:06:31.251 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (476660) - No such process 00:06:31.251 07:01:49 -- common/autotest_common.sh@963 -- # echo 'Process with pid 476660 is not found' 00:06:31.251 Process with pid 476660 is not found 00:06:31.251 07:01:49 -- event/cpu_locks.sh@16 -- # [[ -z 476878 ]] 00:06:31.251 07:01:49 -- event/cpu_locks.sh@16 -- # killprocess 476878 00:06:31.251 07:01:49 -- common/autotest_common.sh@936 -- # '[' -z 476878 ']' 00:06:31.251 07:01:49 -- common/autotest_common.sh@940 -- # kill -0 476878 00:06:31.251 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (476878) - No such process 00:06:31.251 07:01:49 -- common/autotest_common.sh@963 -- # echo 'Process with pid 476878 is not found' 00:06:31.251 Process with pid 476878 is not found 00:06:31.251 07:01:49 -- event/cpu_locks.sh@18 -- # rm -f 00:06:31.251 00:06:31.251 real 0m18.634s 00:06:31.251 user 0m31.697s 00:06:31.251 sys 0m5.955s 00:06:31.251 07:01:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.251 07:01:49 -- common/autotest_common.sh@10 -- # set +x 00:06:31.251 ************************************ 00:06:31.251 END TEST cpu_locks 00:06:31.251 ************************************ 00:06:31.251 00:06:31.251 real 0m43.686s 00:06:31.251 user 1m21.402s 00:06:31.251 sys 0m10.069s 00:06:31.251 07:01:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.251 07:01:49 -- common/autotest_common.sh@10 -- # set +x 00:06:31.251 ************************************ 00:06:31.251 END TEST event 00:06:31.251 ************************************ 00:06:31.251 07:01:49 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:31.251 07:01:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:31.251 07:01:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.251 07:01:49 -- common/autotest_common.sh@10 -- # set +x 00:06:31.251 ************************************ 00:06:31.251 START TEST thread 00:06:31.251 ************************************ 00:06:31.251 07:01:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:31.510 * Looking for test storage... 
00:06:31.510 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:31.510 07:01:49 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:31.510 07:01:49 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:31.510 07:01:49 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:31.510 07:01:49 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:31.510 07:01:49 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:31.510 07:01:49 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:31.510 07:01:49 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:31.510 07:01:49 -- scripts/common.sh@335 -- # IFS=.-: 00:06:31.510 07:01:49 -- scripts/common.sh@335 -- # read -ra ver1 00:06:31.510 07:01:49 -- scripts/common.sh@336 -- # IFS=.-: 00:06:31.510 07:01:49 -- scripts/common.sh@336 -- # read -ra ver2 00:06:31.510 07:01:49 -- scripts/common.sh@337 -- # local 'op=<' 00:06:31.510 07:01:49 -- scripts/common.sh@339 -- # ver1_l=2 00:06:31.510 07:01:49 -- scripts/common.sh@340 -- # ver2_l=1 00:06:31.510 07:01:49 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:31.510 07:01:49 -- scripts/common.sh@343 -- # case "$op" in 00:06:31.510 07:01:49 -- scripts/common.sh@344 -- # : 1 00:06:31.510 07:01:49 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:31.510 07:01:49 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:31.510 07:01:49 -- scripts/common.sh@364 -- # decimal 1 00:06:31.510 07:01:49 -- scripts/common.sh@352 -- # local d=1 00:06:31.510 07:01:49 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:31.510 07:01:49 -- scripts/common.sh@354 -- # echo 1 00:06:31.510 07:01:49 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:31.510 07:01:49 -- scripts/common.sh@365 -- # decimal 2 00:06:31.510 07:01:49 -- scripts/common.sh@352 -- # local d=2 00:06:31.510 07:01:49 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:31.510 07:01:49 -- scripts/common.sh@354 -- # echo 2 00:06:31.510 07:01:49 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:31.510 07:01:49 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:31.510 07:01:49 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:31.510 07:01:49 -- scripts/common.sh@367 -- # return 0 00:06:31.510 07:01:49 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:31.510 07:01:49 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:31.510 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.510 --rc genhtml_branch_coverage=1 00:06:31.510 --rc genhtml_function_coverage=1 00:06:31.510 --rc genhtml_legend=1 00:06:31.510 --rc geninfo_all_blocks=1 00:06:31.510 --rc geninfo_unexecuted_blocks=1 00:06:31.510 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.510 ' 00:06:31.510 07:01:49 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:31.510 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.510 --rc genhtml_branch_coverage=1 00:06:31.510 --rc genhtml_function_coverage=1 00:06:31.510 --rc genhtml_legend=1 00:06:31.510 --rc geninfo_all_blocks=1 00:06:31.510 --rc geninfo_unexecuted_blocks=1 00:06:31.510 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.510 ' 00:06:31.511 07:01:49 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:31.511 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.511 --rc genhtml_branch_coverage=1 
00:06:31.511 --rc genhtml_function_coverage=1 00:06:31.511 --rc genhtml_legend=1 00:06:31.511 --rc geninfo_all_blocks=1 00:06:31.511 --rc geninfo_unexecuted_blocks=1 00:06:31.511 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.511 ' 00:06:31.511 07:01:49 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:31.511 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.511 --rc genhtml_branch_coverage=1 00:06:31.511 --rc genhtml_function_coverage=1 00:06:31.511 --rc genhtml_legend=1 00:06:31.511 --rc geninfo_all_blocks=1 00:06:31.511 --rc geninfo_unexecuted_blocks=1 00:06:31.511 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.511 ' 00:06:31.511 07:01:49 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:31.511 07:01:49 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:31.511 07:01:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.511 07:01:49 -- common/autotest_common.sh@10 -- # set +x 00:06:31.511 ************************************ 00:06:31.511 START TEST thread_poller_perf 00:06:31.511 ************************************ 00:06:31.511 07:01:49 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:31.511 [2024-12-13 07:01:49.616249] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:31.511 [2024-12-13 07:01:49.616318] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477313 ] 00:06:31.511 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.511 [2024-12-13 07:01:49.679744] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.511 [2024-12-13 07:01:49.715865] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.511 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:32.887 [2024-12-13T06:01:51.126Z] ====================================== 00:06:32.887 [2024-12-13T06:01:51.126Z] busy:2506632962 (cyc) 00:06:32.887 [2024-12-13T06:01:51.126Z] total_run_count: 732000 00:06:32.887 [2024-12-13T06:01:51.126Z] tsc_hz: 2500000000 (cyc) 00:06:32.887 [2024-12-13T06:01:51.126Z] ====================================== 00:06:32.887 [2024-12-13T06:01:51.126Z] poller_cost: 3424 (cyc), 1369 (nsec) 00:06:32.887 00:06:32.887 real 0m1.172s 00:06:32.887 user 0m1.094s 00:06:32.887 sys 0m0.073s 00:06:32.887 07:01:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:32.887 07:01:50 -- common/autotest_common.sh@10 -- # set +x 00:06:32.887 ************************************ 00:06:32.887 END TEST thread_poller_perf 00:06:32.887 ************************************ 00:06:32.887 07:01:50 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:32.887 07:01:50 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:32.887 07:01:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.887 07:01:50 -- common/autotest_common.sh@10 -- # set +x 00:06:32.887 ************************************ 00:06:32.887 START TEST thread_poller_perf 00:06:32.887 ************************************ 00:06:32.887 07:01:50 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:32.888 [2024-12-13 07:01:50.835793] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:32.888 [2024-12-13 07:01:50.835855] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477597 ] 00:06:32.888 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.888 [2024-12-13 07:01:50.897379] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.888 [2024-12-13 07:01:50.932330] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.888 Running 1000 pollers for 1 seconds with 0 microseconds period. 
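poller_cost in these summaries is just busy cycles divided by total_run_count, converted to nanoseconds via tsc_hz. For the 1-microsecond-period run above: 2506632962 / 732000 ≈ 3424 cyc, and 3424 cyc at the reported 2.5 GHz TSC ≈ 1369 nsec, matching the printed line; the 0-period run whose summary follows below works out the same way (191 cyc, 76 nsec). As shell arithmetic:

    echo $(( 2506632962 / 732000 ))             # 3424 cycles per poll
    echo $(( 3424 * 1000000000 / 2500000000 ))  # 1369 nsec at tsc_hz 2500000000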
00:06:33.824 [2024-12-13T06:01:52.063Z] ====================================== 00:06:33.824 [2024-12-13T06:01:52.063Z] busy:2501899050 (cyc) 00:06:33.824 [2024-12-13T06:01:52.063Z] total_run_count: 13067000 00:06:33.824 [2024-12-13T06:01:52.063Z] tsc_hz: 2500000000 (cyc) 00:06:33.824 [2024-12-13T06:01:52.063Z] ====================================== 00:06:33.824 [2024-12-13T06:01:52.063Z] poller_cost: 191 (cyc), 76 (nsec) 00:06:33.824 00:06:33.824 real 0m1.163s 00:06:33.824 user 0m1.079s 00:06:33.824 sys 0m0.080s 00:06:33.824 07:01:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:33.824 07:01:51 -- common/autotest_common.sh@10 -- # set +x 00:06:33.824 ************************************ 00:06:33.824 END TEST thread_poller_perf 00:06:33.824 ************************************ 00:06:33.824 07:01:52 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:33.824 07:01:52 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:33.824 07:01:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:33.824 07:01:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:33.824 07:01:52 -- common/autotest_common.sh@10 -- # set +x 00:06:33.824 ************************************ 00:06:33.824 START TEST thread_spdk_lock 00:06:33.824 ************************************ 00:06:33.824 07:01:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:33.824 [2024-12-13 07:01:52.056038] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:33.824 [2024-12-13 07:01:52.056127] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477880 ] 00:06:34.083 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.083 [2024-12-13 07:01:52.122469] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:34.083 [2024-12-13 07:01:52.158241] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:34.083 [2024-12-13 07:01:52.158244] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.651 [2024-12-13 07:01:52.639618] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:34.651 [2024-12-13 07:01:52.639655] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:34.651 [2024-12-13 07:01:52.639665] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x12e2e40 00:06:34.651 [2024-12-13 07:01:52.640470] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:34.651 [2024-12-13 07:01:52.640572] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:34.651 [2024-12-13 07:01:52.640591] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:34.651 Starting test contend 00:06:34.651 Worker Delay Wait us Hold us Total us 00:06:34.651 0 3 162791 179348 342140 00:06:34.651 1 5 80885 280639 361524 00:06:34.651 PASS test contend 00:06:34.651 Starting test hold_by_poller 00:06:34.651 PASS test hold_by_poller 00:06:34.651 Starting test hold_by_message 00:06:34.651 PASS test hold_by_message 00:06:34.651 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:34.651 100014 assertions passed 00:06:34.651 0 assertions failed 00:06:34.651 00:06:34.651 real 0m0.652s 00:06:34.651 user 0m1.048s 00:06:34.651 sys 0m0.083s 00:06:34.651 07:01:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.651 07:01:52 -- common/autotest_common.sh@10 -- # set +x 00:06:34.651 ************************************ 00:06:34.651 END TEST thread_spdk_lock 00:06:34.651 ************************************ 00:06:34.651 00:06:34.651 real 0m3.319s 00:06:34.651 user 0m3.389s 00:06:34.651 sys 0m0.443s 00:06:34.651 07:01:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.651 07:01:52 -- common/autotest_common.sh@10 -- # set +x 00:06:34.651 ************************************ 00:06:34.651 END TEST thread 00:06:34.651 ************************************ 00:06:34.651 07:01:52 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:34.651 07:01:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:34.651 07:01:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.651 07:01:52 -- common/autotest_common.sh@10 -- # set +x 00:06:34.651 ************************************ 00:06:34.651 START TEST accel 00:06:34.651 ************************************ 00:06:34.651 07:01:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:34.651 * Looking for test storage... 00:06:34.651 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:34.651 07:01:52 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:34.651 07:01:52 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:34.651 07:01:52 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:34.911 07:01:52 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:34.911 07:01:52 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:34.911 07:01:52 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:34.911 07:01:52 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:34.911 07:01:52 -- scripts/common.sh@335 -- # IFS=.-: 00:06:34.911 07:01:52 -- scripts/common.sh@335 -- # read -ra ver1 00:06:34.911 07:01:52 -- scripts/common.sh@336 -- # IFS=.-: 00:06:34.911 07:01:52 -- scripts/common.sh@336 -- # read -ra ver2 00:06:34.911 07:01:52 -- scripts/common.sh@337 -- # local 'op=<' 00:06:34.911 07:01:52 -- scripts/common.sh@339 -- # ver1_l=2 00:06:34.911 07:01:52 -- scripts/common.sh@340 -- # ver2_l=1 00:06:34.911 07:01:52 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:34.911 07:01:52 -- scripts/common.sh@343 -- # case "$op" in 00:06:34.911 07:01:52 -- scripts/common.sh@344 -- # : 1 00:06:34.911 07:01:52 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:34.911 07:01:52 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:34.911 07:01:52 -- scripts/common.sh@364 -- # decimal 1 00:06:34.911 07:01:52 -- scripts/common.sh@352 -- # local d=1 00:06:34.911 07:01:52 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:34.911 07:01:52 -- scripts/common.sh@354 -- # echo 1 00:06:34.911 07:01:52 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:34.911 07:01:52 -- scripts/common.sh@365 -- # decimal 2 00:06:34.911 07:01:52 -- scripts/common.sh@352 -- # local d=2 00:06:34.911 07:01:52 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:34.911 07:01:52 -- scripts/common.sh@354 -- # echo 2 00:06:34.911 07:01:52 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:34.911 07:01:52 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:34.911 07:01:52 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:34.911 07:01:52 -- scripts/common.sh@367 -- # return 0 00:06:34.911 07:01:52 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:34.911 07:01:52 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:34.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.911 --rc genhtml_branch_coverage=1 00:06:34.911 --rc genhtml_function_coverage=1 00:06:34.911 --rc genhtml_legend=1 00:06:34.911 --rc geninfo_all_blocks=1 00:06:34.911 --rc geninfo_unexecuted_blocks=1 00:06:34.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.911 ' 00:06:34.911 07:01:52 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:34.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.911 --rc genhtml_branch_coverage=1 00:06:34.911 --rc genhtml_function_coverage=1 00:06:34.911 --rc genhtml_legend=1 00:06:34.911 --rc geninfo_all_blocks=1 00:06:34.911 --rc geninfo_unexecuted_blocks=1 00:06:34.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.911 ' 00:06:34.911 07:01:52 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:34.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.911 --rc genhtml_branch_coverage=1 00:06:34.911 --rc genhtml_function_coverage=1 00:06:34.911 --rc genhtml_legend=1 00:06:34.911 --rc geninfo_all_blocks=1 00:06:34.911 --rc geninfo_unexecuted_blocks=1 00:06:34.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.911 ' 00:06:34.911 07:01:52 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:34.911 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.911 --rc genhtml_branch_coverage=1 00:06:34.911 --rc genhtml_function_coverage=1 00:06:34.911 --rc genhtml_legend=1 00:06:34.911 --rc geninfo_all_blocks=1 00:06:34.911 --rc geninfo_unexecuted_blocks=1 00:06:34.911 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.911 ' 00:06:34.911 07:01:52 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:34.911 07:01:52 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:34.911 07:01:52 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:34.911 07:01:52 -- accel/accel.sh@59 -- # spdk_tgt_pid=477962 00:06:34.911 07:01:52 -- accel/accel.sh@60 -- # waitforlisten 477962 00:06:34.911 07:01:52 -- common/autotest_common.sh@829 -- # '[' -z 477962 ']' 00:06:34.911 07:01:52 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:34.911 07:01:52 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:34.911 07:01:52 
-- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:34.911 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:34.911 07:01:52 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:34.911 07:01:52 -- common/autotest_common.sh@10 -- # set +x 00:06:34.911 07:01:52 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:34.911 07:01:52 -- accel/accel.sh@58 -- # build_accel_config 00:06:34.911 07:01:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:34.911 07:01:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.911 07:01:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.911 07:01:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:34.911 07:01:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:34.911 07:01:52 -- accel/accel.sh@41 -- # local IFS=, 00:06:34.911 07:01:52 -- accel/accel.sh@42 -- # jq -r . 00:06:34.911 [2024-12-13 07:01:52.967856] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:34.911 [2024-12-13 07:01:52.967933] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477962 ] 00:06:34.911 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.911 [2024-12-13 07:01:53.035287] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.911 [2024-12-13 07:01:53.071702] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:34.911 [2024-12-13 07:01:53.071808] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.848 07:01:53 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.848 07:01:53 -- common/autotest_common.sh@862 -- # return 0 00:06:35.848 07:01:53 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:35.848 07:01:53 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:35.848 07:01:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.848 07:01:53 -- common/autotest_common.sh@10 -- # set +x 00:06:35.848 07:01:53 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:35.848 07:01:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in 
"${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # IFS== 00:06:35.848 07:01:53 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.848 07:01:53 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.848 07:01:53 -- accel/accel.sh@67 -- # killprocess 477962 00:06:35.848 07:01:53 -- common/autotest_common.sh@936 -- # '[' -z 477962 ']' 00:06:35.848 07:01:53 -- common/autotest_common.sh@940 -- # kill -0 477962 00:06:35.848 07:01:53 -- common/autotest_common.sh@941 -- # uname 00:06:35.848 07:01:53 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:35.848 07:01:53 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 477962 00:06:35.848 07:01:53 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:35.848 07:01:53 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:35.848 07:01:53 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 477962' 00:06:35.848 killing process with pid 477962 00:06:35.848 07:01:53 -- common/autotest_common.sh@955 -- # kill 477962 00:06:35.848 07:01:53 -- common/autotest_common.sh@960 -- # wait 477962 00:06:36.107 07:01:54 -- accel/accel.sh@68 -- # trap - ERR 00:06:36.107 07:01:54 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:36.107 07:01:54 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:36.107 07:01:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.107 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.107 07:01:54 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:06:36.107 07:01:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:36.107 07:01:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.107 07:01:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.107 07:01:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.107 07:01:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.107 07:01:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.107 07:01:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.107 07:01:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.107 07:01:54 -- accel/accel.sh@42 -- # jq -r . 
00:06:36.107 07:01:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.107 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.107 07:01:54 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:36.107 07:01:54 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:36.107 07:01:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.107 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.107 ************************************ 00:06:36.107 START TEST accel_missing_filename 00:06:36.107 ************************************ 00:06:36.107 07:01:54 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:06:36.107 07:01:54 -- common/autotest_common.sh@650 -- # local es=0 00:06:36.107 07:01:54 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:36.107 07:01:54 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:36.107 07:01:54 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.107 07:01:54 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:36.107 07:01:54 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.107 07:01:54 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:06:36.107 07:01:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:36.107 07:01:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.107 07:01:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.107 07:01:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.107 07:01:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.107 07:01:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.107 07:01:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.107 07:01:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.107 07:01:54 -- accel/accel.sh@42 -- # jq -r . 00:06:36.107 [2024-12-13 07:01:54.268266] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:36.107 [2024-12-13 07:01:54.268354] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478266 ] 00:06:36.108 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.108 [2024-12-13 07:01:54.335901] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.366 [2024-12-13 07:01:54.372083] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.366 [2024-12-13 07:01:54.411647] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:36.366 [2024-12-13 07:01:54.471326] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:36.366 A filename is required. 
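"A filename is required." is the expected outcome here: a compress workload with no -l input file must abort, and the NOT wrapper turns that failure into a pass. The invalid and valid-looking invocations, as traced in this test and the next one (paths shortened to the repo-relative form):

    accel_perf -t 1 -w compress                       # aborts: a filename is required
    accel_perf -t 1 -w compress -l test/accel/bib -y  # next test: verify is rejected for compress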
00:06:36.366 07:01:54 -- common/autotest_common.sh@653 -- # es=234 00:06:36.366 07:01:54 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:36.366 07:01:54 -- common/autotest_common.sh@662 -- # es=106 00:06:36.366 07:01:54 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:36.366 07:01:54 -- common/autotest_common.sh@670 -- # es=1 00:06:36.366 07:01:54 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:36.366 00:06:36.366 real 0m0.281s 00:06:36.366 user 0m0.188s 00:06:36.366 sys 0m0.133s 00:06:36.366 07:01:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.366 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.366 ************************************ 00:06:36.366 END TEST accel_missing_filename 00:06:36.366 ************************************ 00:06:36.366 07:01:54 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:36.366 07:01:54 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:36.366 07:01:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.366 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.366 ************************************ 00:06:36.366 START TEST accel_compress_verify 00:06:36.366 ************************************ 00:06:36.366 07:01:54 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:36.366 07:01:54 -- common/autotest_common.sh@650 -- # local es=0 00:06:36.366 07:01:54 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:36.366 07:01:54 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:36.366 07:01:54 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.366 07:01:54 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:36.366 07:01:54 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.366 07:01:54 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:36.366 07:01:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:36.366 07:01:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.366 07:01:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.366 07:01:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.366 07:01:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.366 07:01:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.366 07:01:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.366 07:01:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.366 07:01:54 -- accel/accel.sh@42 -- # jq -r . 00:06:36.366 [2024-12-13 07:01:54.587649] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:36.366 [2024-12-13 07:01:54.587777] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478305 ] 00:06:36.626 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.626 [2024-12-13 07:01:54.656678] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.626 [2024-12-13 07:01:54.691934] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.626 [2024-12-13 07:01:54.731434] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:36.626 [2024-12-13 07:01:54.791354] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:36.626 00:06:36.626 Compression does not support the verify option, aborting. 00:06:36.626 07:01:54 -- common/autotest_common.sh@653 -- # es=161 00:06:36.626 07:01:54 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:36.626 07:01:54 -- common/autotest_common.sh@662 -- # es=33 00:06:36.626 07:01:54 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:36.626 07:01:54 -- common/autotest_common.sh@670 -- # es=1 00:06:36.626 07:01:54 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:36.626 00:06:36.626 real 0m0.282s 00:06:36.626 user 0m0.191s 00:06:36.626 sys 0m0.130s 00:06:36.626 07:01:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.626 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.626 ************************************ 00:06:36.626 END TEST accel_compress_verify 00:06:36.626 ************************************ 00:06:36.885 07:01:54 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:36.885 07:01:54 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:36.885 07:01:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.885 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.885 ************************************ 00:06:36.885 START TEST accel_wrong_workload 00:06:36.885 ************************************ 00:06:36.885 07:01:54 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:06:36.885 07:01:54 -- common/autotest_common.sh@650 -- # local es=0 00:06:36.885 07:01:54 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:36.885 07:01:54 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:36.885 07:01:54 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.885 07:01:54 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:36.885 07:01:54 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.885 07:01:54 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:06:36.885 07:01:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:36.885 07:01:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.885 07:01:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.885 07:01:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.885 07:01:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.885 07:01:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.885 07:01:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.885 07:01:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.885 07:01:54 -- accel/accel.sh@42 -- # jq -r . 
00:06:36.885 Unsupported workload type: foobar 00:06:36.885 [2024-12-13 07:01:54.905827] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:36.885 accel_perf options: 00:06:36.885 [-h help message] 00:06:36.885 [-q queue depth per core] 00:06:36.885 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:36.885 [-T number of threads per core 00:06:36.885 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:36.885 [-t time in seconds] 00:06:36.885 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:36.885 [ dif_verify, , dif_generate, dif_generate_copy 00:06:36.885 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:36.885 [-l for compress/decompress workloads, name of uncompressed input file 00:06:36.885 [-S for crc32c workload, use this seed value (default 0) 00:06:36.885 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:36.885 [-f for fill workload, use this BYTE value (default 255) 00:06:36.885 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:36.885 [-y verify result if this switch is on] 00:06:36.885 [-a tasks to allocate per core (default: same value as -q)] 00:06:36.885 Can be used to spread operations across a wider range of memory. 00:06:36.885 07:01:54 -- common/autotest_common.sh@653 -- # es=1 00:06:36.885 07:01:54 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:36.885 07:01:54 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:36.885 07:01:54 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:36.885 00:06:36.885 real 0m0.023s 00:06:36.885 user 0m0.007s 00:06:36.885 sys 0m0.016s 00:06:36.885 07:01:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.885 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.885 ************************************ 00:06:36.885 END TEST accel_wrong_workload 00:06:36.885 ************************************ 00:06:36.885 Error: writing output failed: Broken pipe 00:06:36.885 07:01:54 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:36.885 07:01:54 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:36.885 07:01:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.885 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.885 ************************************ 00:06:36.885 START TEST accel_negative_buffers 00:06:36.885 ************************************ 00:06:36.885 07:01:54 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:36.885 07:01:54 -- common/autotest_common.sh@650 -- # local es=0 00:06:36.885 07:01:54 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:36.885 07:01:54 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:36.885 07:01:54 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.885 07:01:54 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:36.885 07:01:54 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.885 07:01:54 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:06:36.885 07:01:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:36.885 07:01:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.885 07:01:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.885 07:01:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.885 07:01:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.885 07:01:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.885 07:01:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.885 07:01:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.885 07:01:54 -- accel/accel.sh@42 -- # jq -r . 00:06:36.885 -x option must be non-negative. 00:06:36.885 [2024-12-13 07:01:54.966015] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:36.885 accel_perf options: 00:06:36.885 [-h help message] 00:06:36.885 [-q queue depth per core] 00:06:36.885 [-C for supported workloads, use this value to configure the io vector size to test (default 1)] 00:06:36.885 [-T number of threads per core] 00:06:36.885 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:36.885 [-t time in seconds] 00:06:36.885 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:36.885 [ dif_verify, dif_generate, dif_generate_copy] 00:06:36.885 [-M assign module to the operation, not compatible with accel_assign_opc RPC] 00:06:36.885 [-l for compress/decompress workloads, name of uncompressed input file] 00:06:36.885 [-S for crc32c workload, use this seed value (default 0)] 00:06:36.885 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)] 00:06:36.885 [-f for fill workload, use this BYTE value (default 255)] 00:06:36.885 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:36.885 [-y verify result if this switch is on] 00:06:36.885 [-a tasks to allocate per core (default: same value as -q)] 00:06:36.885 Can be used to spread operations across a wider range of memory.
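The two parse failures above are the intended negative cases: 'foobar' is not a recognized workload, and -x rejects negative buffer counts. For contrast, a minimal sketch of a valid xor invocation per the usage text above (binary path taken from the trace; flags assumed to map exactly as documented):

  # hedged sketch, not part of the test run: smallest valid xor case
  # -x takes the number of source buffers, minimum 2 per the usage text
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -t 1 -w xor -y -x 2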
00:06:36.885 07:01:54 -- common/autotest_common.sh@653 -- # es=1 00:06:36.885 07:01:54 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:36.885 07:01:54 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:36.885 07:01:54 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:36.885 00:06:36.885 real 0m0.023s 00:06:36.885 user 0m0.012s 00:06:36.885 sys 0m0.011s 00:06:36.885 07:01:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.885 07:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:36.885 ************************************ 00:06:36.885 END TEST accel_negative_buffers 00:06:36.885 ************************************ 00:06:36.885 Error: writing output failed: Broken pipe 00:06:36.885 07:01:55 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:36.885 07:01:55 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:36.885 07:01:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.885 07:01:55 -- common/autotest_common.sh@10 -- # set +x 00:06:36.885 ************************************ 00:06:36.885 START TEST accel_crc32c 00:06:36.885 ************************************ 00:06:36.885 07:01:55 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:36.885 07:01:55 -- accel/accel.sh@16 -- # local accel_opc 00:06:36.885 07:01:55 -- accel/accel.sh@17 -- # local accel_module 00:06:36.885 07:01:55 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:36.885 07:01:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:36.885 07:01:55 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.885 07:01:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.885 07:01:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.885 07:01:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.885 07:01:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.886 07:01:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.886 07:01:55 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.886 07:01:55 -- accel/accel.sh@42 -- # jq -r . 00:06:36.886 [2024-12-13 07:01:55.029576] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:36.886 [2024-12-13 07:01:55.029654] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478574 ] 00:06:36.886 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.886 [2024-12-13 07:01:55.097072] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.144 [2024-12-13 07:01:55.133731] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.080 07:01:56 -- accel/accel.sh@18 -- # out=' 00:06:38.080 SPDK Configuration: 00:06:38.080 Core mask: 0x1 00:06:38.080 00:06:38.080 Accel Perf Configuration: 00:06:38.080 Workload Type: crc32c 00:06:38.080 CRC-32C seed: 32 00:06:38.080 Transfer size: 4096 bytes 00:06:38.080 Vector count 1 00:06:38.080 Module: software 00:06:38.080 Queue depth: 32 00:06:38.080 Allocate depth: 32 00:06:38.080 # threads/core: 1 00:06:38.080 Run time: 1 seconds 00:06:38.080 Verify: Yes 00:06:38.080 00:06:38.080 Running for 1 seconds... 
00:06:38.080 00:06:38.080 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:38.080 ------------------------------------------------------------------------------------ 00:06:38.080 0,0 846176/s 3305 MiB/s 0 0 00:06:38.080 ==================================================================================== 00:06:38.080 Total 846176/s 3305 MiB/s 0 0' 00:06:38.080 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.080 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.080 07:01:56 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:38.080 07:01:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:38.080 07:01:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.080 07:01:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.080 07:01:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.080 07:01:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.080 07:01:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.080 07:01:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.080 07:01:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.080 07:01:56 -- accel/accel.sh@42 -- # jq -r . 00:06:38.080 [2024-12-13 07:01:56.314567] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:38.080 [2024-12-13 07:01:56.314655] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478725 ] 00:06:38.339 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.339 [2024-12-13 07:01:56.382095] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.339 [2024-12-13 07:01:56.416865] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.339 07:01:56 -- accel/accel.sh@21 -- # val= 00:06:38.339 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.339 07:01:56 -- accel/accel.sh@21 -- # val= 00:06:38.339 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.339 07:01:56 -- accel/accel.sh@21 -- # val=0x1 00:06:38.339 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.339 07:01:56 -- accel/accel.sh@21 -- # val= 00:06:38.339 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.339 07:01:56 -- accel/accel.sh@21 -- # val= 00:06:38.339 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.339 07:01:56 -- accel/accel.sh@21 -- # val=crc32c 00:06:38.339 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.339 07:01:56 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:38.339 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val=32 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 
07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val= 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val=software 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@23 -- # accel_module=software 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val=32 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val=32 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val=1 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val=Yes 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val= 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:38.340 07:01:56 -- accel/accel.sh@21 -- # val= 00:06:38.340 07:01:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # IFS=: 00:06:38.340 07:01:56 -- accel/accel.sh@20 -- # read -r var val 00:06:39.717 07:01:57 -- accel/accel.sh@21 -- # val= 00:06:39.717 07:01:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.717 07:01:57 -- accel/accel.sh@21 -- # val= 00:06:39.717 07:01:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.717 07:01:57 -- accel/accel.sh@21 -- # val= 00:06:39.717 07:01:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.717 07:01:57 -- accel/accel.sh@21 -- # val= 00:06:39.717 07:01:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.717 07:01:57 -- accel/accel.sh@21 -- # val= 00:06:39.717 07:01:57 -- accel/accel.sh@22 -- # case "$var" in 
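The val= lines above are accel.sh replaying the test parameters through its config loop: workload crc32c, seed 32, '4096 bytes' transfers, the software module, queue and allocate depths of 32, one thread, a one-second run, and verification on. Read against the usage text earlier in the log, that corresponds to roughly this direct command line (a sketch under the assumption that each traced value maps to the documented flag; the software module is the default rather than an explicit -M):

  # assumed equivalent of the traced configuration, not taken from the log itself
  accel_perf -t 1 -w crc32c -S 32 -o 4096 -q 32 -a 32 -T 1 -y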
00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.717 07:01:57 -- accel/accel.sh@21 -- # val= 00:06:39.717 07:01:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # IFS=: 00:06:39.717 07:01:57 -- accel/accel.sh@20 -- # read -r var val 00:06:39.717 07:01:57 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:39.717 07:01:57 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:39.717 07:01:57 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:39.717 00:06:39.717 real 0m2.568s 00:06:39.717 user 0m2.312s 00:06:39.717 sys 0m0.253s 00:06:39.718 07:01:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:39.718 07:01:57 -- common/autotest_common.sh@10 -- # set +x 00:06:39.718 ************************************ 00:06:39.718 END TEST accel_crc32c 00:06:39.718 ************************************ 00:06:39.718 07:01:57 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:39.718 07:01:57 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:39.718 07:01:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:39.718 07:01:57 -- common/autotest_common.sh@10 -- # set +x 00:06:39.718 ************************************ 00:06:39.718 START TEST accel_crc32c_C2 00:06:39.718 ************************************ 00:06:39.718 07:01:57 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:39.718 07:01:57 -- accel/accel.sh@16 -- # local accel_opc 00:06:39.718 07:01:57 -- accel/accel.sh@17 -- # local accel_module 00:06:39.718 07:01:57 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:39.718 07:01:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:39.718 07:01:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.718 07:01:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.718 07:01:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.718 07:01:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.718 07:01:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.718 07:01:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.718 07:01:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.718 07:01:57 -- accel/accel.sh@42 -- # jq -r . 00:06:39.718 [2024-12-13 07:01:57.639736] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:39.718 [2024-12-13 07:01:57.639824] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478910 ] 00:06:39.718 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.718 [2024-12-13 07:01:57.708038] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.718 [2024-12-13 07:01:57.743650] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.095 07:01:58 -- accel/accel.sh@18 -- # out=' 00:06:41.095 SPDK Configuration: 00:06:41.095 Core mask: 0x1 00:06:41.095 00:06:41.095 Accel Perf Configuration: 00:06:41.095 Workload Type: crc32c 00:06:41.095 CRC-32C seed: 0 00:06:41.095 Transfer size: 4096 bytes 00:06:41.095 Vector count 2 00:06:41.095 Module: software 00:06:41.095 Queue depth: 32 00:06:41.095 Allocate depth: 32 00:06:41.095 # threads/core: 1 00:06:41.095 Run time: 1 seconds 00:06:41.095 Verify: Yes 00:06:41.095 00:06:41.095 Running for 1 seconds... 00:06:41.095 00:06:41.095 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:41.095 ------------------------------------------------------------------------------------ 00:06:41.095 0,0 615424/s 4808 MiB/s 0 0 00:06:41.095 ==================================================================================== 00:06:41.095 Total 615424/s 2404 MiB/s 0 0' 00:06:41.095 07:01:58 -- accel/accel.sh@20 -- # IFS=: 00:06:41.095 07:01:58 -- accel/accel.sh@20 -- # read -r var val 00:06:41.095 07:01:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:41.095 07:01:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:41.095 07:01:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.095 07:01:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.095 07:01:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.095 07:01:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.095 07:01:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.095 07:01:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.095 07:01:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.095 07:01:58 -- accel/accel.sh@42 -- # jq -r . 00:06:41.095 [2024-12-13 07:01:58.914909] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:41.095 [2024-12-13 07:01:58.914971] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479167 ] 00:06:41.095 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.095 [2024-12-13 07:01:58.976881] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.095 [2024-12-13 07:01:59.011018] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.095 07:01:59 -- accel/accel.sh@21 -- # val= 00:06:41.095 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.095 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val= 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val=0x1 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val= 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val= 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val=crc32c 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val=0 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val= 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val=software 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@23 -- # accel_module=software 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val=32 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val=32 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- 
accel/accel.sh@21 -- # val=1 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val=Yes 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val= 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:41.096 07:01:59 -- accel/accel.sh@21 -- # val= 00:06:41.096 07:01:59 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # IFS=: 00:06:41.096 07:01:59 -- accel/accel.sh@20 -- # read -r var val 00:06:42.032 07:02:00 -- accel/accel.sh@21 -- # val= 00:06:42.032 07:02:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # IFS=: 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # read -r var val 00:06:42.032 07:02:00 -- accel/accel.sh@21 -- # val= 00:06:42.032 07:02:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # IFS=: 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # read -r var val 00:06:42.032 07:02:00 -- accel/accel.sh@21 -- # val= 00:06:42.032 07:02:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # IFS=: 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # read -r var val 00:06:42.032 07:02:00 -- accel/accel.sh@21 -- # val= 00:06:42.032 07:02:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # IFS=: 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # read -r var val 00:06:42.032 07:02:00 -- accel/accel.sh@21 -- # val= 00:06:42.032 07:02:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # IFS=: 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # read -r var val 00:06:42.032 07:02:00 -- accel/accel.sh@21 -- # val= 00:06:42.032 07:02:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # IFS=: 00:06:42.032 07:02:00 -- accel/accel.sh@20 -- # read -r var val 00:06:42.032 07:02:00 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:42.032 07:02:00 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:42.032 07:02:00 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:42.032 00:06:42.032 real 0m2.552s 00:06:42.032 user 0m2.306s 00:06:42.032 sys 0m0.244s 00:06:42.032 07:02:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:42.032 07:02:00 -- common/autotest_common.sh@10 -- # set +x 00:06:42.032 ************************************ 00:06:42.032 END TEST accel_crc32c_C2 00:06:42.032 ************************************ 00:06:42.032 07:02:00 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:42.032 07:02:00 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:42.032 07:02:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:42.032 07:02:00 -- common/autotest_common.sh@10 -- # set +x 00:06:42.033 ************************************ 00:06:42.033 START TEST accel_copy 
00:06:42.033 ************************************ 00:06:42.033 07:02:00 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:06:42.033 07:02:00 -- accel/accel.sh@16 -- # local accel_opc 00:06:42.033 07:02:00 -- accel/accel.sh@17 -- # local accel_module 00:06:42.033 07:02:00 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:42.033 07:02:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:42.033 07:02:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.033 07:02:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.033 07:02:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.033 07:02:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.033 07:02:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.033 07:02:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.033 07:02:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.033 07:02:00 -- accel/accel.sh@42 -- # jq -r . 00:06:42.033 [2024-12-13 07:02:00.231183] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:42.033 [2024-12-13 07:02:00.231277] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479448 ] 00:06:42.033 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.291 [2024-12-13 07:02:00.298687] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.291 [2024-12-13 07:02:00.334462] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.668 07:02:01 -- accel/accel.sh@18 -- # out=' 00:06:43.668 SPDK Configuration: 00:06:43.668 Core mask: 0x1 00:06:43.668 00:06:43.668 Accel Perf Configuration: 00:06:43.668 Workload Type: copy 00:06:43.668 Transfer size: 4096 bytes 00:06:43.668 Vector count 1 00:06:43.668 Module: software 00:06:43.668 Queue depth: 32 00:06:43.668 Allocate depth: 32 00:06:43.668 # threads/core: 1 00:06:43.668 Run time: 1 seconds 00:06:43.668 Verify: Yes 00:06:43.668 00:06:43.668 Running for 1 seconds... 00:06:43.668 00:06:43.668 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:43.668 ------------------------------------------------------------------------------------ 00:06:43.668 0,0 558048/s 2179 MiB/s 0 0 00:06:43.668 ==================================================================================== 00:06:43.668 Total 558048/s 2179 MiB/s 0 0' 00:06:43.668 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.668 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.668 07:02:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:43.668 07:02:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:43.668 07:02:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.668 07:02:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.668 07:02:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.668 07:02:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.668 07:02:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.668 07:02:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.668 07:02:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.668 07:02:01 -- accel/accel.sh@42 -- # jq -r . 00:06:43.668 [2024-12-13 07:02:01.506143] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:43.668 [2024-12-13 07:02:01.506209] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479720 ] 00:06:43.668 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.668 [2024-12-13 07:02:01.567417] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.668 [2024-12-13 07:02:01.602480] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.668 07:02:01 -- accel/accel.sh@21 -- # val= 00:06:43.668 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.668 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.668 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.668 07:02:01 -- accel/accel.sh@21 -- # val= 00:06:43.668 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.668 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.668 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.668 07:02:01 -- accel/accel.sh@21 -- # val=0x1 00:06:43.668 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.668 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.668 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.668 07:02:01 -- accel/accel.sh@21 -- # val= 00:06:43.668 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val= 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val=copy 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val= 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val=software 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@23 -- # accel_module=software 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val=32 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val=32 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val=1 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val=Yes 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val= 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:43.669 07:02:01 -- accel/accel.sh@21 -- # val= 00:06:43.669 07:02:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # IFS=: 00:06:43.669 07:02:01 -- accel/accel.sh@20 -- # read -r var val 00:06:44.605 07:02:02 -- accel/accel.sh@21 -- # val= 00:06:44.606 07:02:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # IFS=: 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # read -r var val 00:06:44.606 07:02:02 -- accel/accel.sh@21 -- # val= 00:06:44.606 07:02:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # IFS=: 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # read -r var val 00:06:44.606 07:02:02 -- accel/accel.sh@21 -- # val= 00:06:44.606 07:02:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # IFS=: 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # read -r var val 00:06:44.606 07:02:02 -- accel/accel.sh@21 -- # val= 00:06:44.606 07:02:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # IFS=: 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # read -r var val 00:06:44.606 07:02:02 -- accel/accel.sh@21 -- # val= 00:06:44.606 07:02:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # IFS=: 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # read -r var val 00:06:44.606 07:02:02 -- accel/accel.sh@21 -- # val= 00:06:44.606 07:02:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # IFS=: 00:06:44.606 07:02:02 -- accel/accel.sh@20 -- # read -r var val 00:06:44.606 07:02:02 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:44.606 07:02:02 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:44.606 07:02:02 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:44.606 00:06:44.606 real 0m2.552s 00:06:44.606 user 0m2.298s 00:06:44.606 sys 0m0.250s 00:06:44.606 07:02:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:44.606 07:02:02 -- common/autotest_common.sh@10 -- # set +x 00:06:44.606 ************************************ 00:06:44.606 END TEST accel_copy 00:06:44.606 ************************************ 00:06:44.606 07:02:02 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:44.606 07:02:02 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:44.606 07:02:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:44.606 07:02:02 -- common/autotest_common.sh@10 -- # set +x 00:06:44.606 ************************************ 00:06:44.606 START TEST accel_fill 00:06:44.606 ************************************ 00:06:44.606 07:02:02 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:44.606 07:02:02 -- accel/accel.sh@16 -- # local accel_opc 
00:06:44.606 07:02:02 -- accel/accel.sh@17 -- # local accel_module 00:06:44.606 07:02:02 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:44.606 07:02:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:44.606 07:02:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.606 07:02:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.606 07:02:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.606 07:02:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.606 07:02:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.606 07:02:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.606 07:02:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.606 07:02:02 -- accel/accel.sh@42 -- # jq -r . 00:06:44.606 [2024-12-13 07:02:02.827476] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:44.606 [2024-12-13 07:02:02.827562] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480004 ] 00:06:44.865 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.865 [2024-12-13 07:02:02.895854] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.865 [2024-12-13 07:02:02.931410] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.239 07:02:04 -- accel/accel.sh@18 -- # out=' 00:06:46.239 SPDK Configuration: 00:06:46.239 Core mask: 0x1 00:06:46.239 00:06:46.239 Accel Perf Configuration: 00:06:46.239 Workload Type: fill 00:06:46.239 Fill pattern: 0x80 00:06:46.239 Transfer size: 4096 bytes 00:06:46.239 Vector count 1 00:06:46.239 Module: software 00:06:46.239 Queue depth: 64 00:06:46.239 Allocate depth: 64 00:06:46.239 # threads/core: 1 00:06:46.239 Run time: 1 seconds 00:06:46.239 Verify: Yes 00:06:46.239 00:06:46.239 Running for 1 seconds... 00:06:46.239 00:06:46.239 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:46.239 ------------------------------------------------------------------------------------ 00:06:46.239 0,0 972032/s 3797 MiB/s 0 0 00:06:46.239 ==================================================================================== 00:06:46.239 Total 972032/s 3797 MiB/s 0 0' 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.239 07:02:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.239 07:02:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:46.239 07:02:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.239 07:02:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.239 07:02:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.239 07:02:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.239 07:02:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.239 07:02:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.239 07:02:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.239 07:02:04 -- accel/accel.sh@42 -- # jq -r . 00:06:46.239 [2024-12-13 07:02:04.103551] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:46.239 [2024-12-13 07:02:04.103614] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480206 ] 00:06:46.239 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.239 [2024-12-13 07:02:04.165862] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.239 [2024-12-13 07:02:04.200964] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.239 07:02:04 -- accel/accel.sh@21 -- # val= 00:06:46.239 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.239 07:02:04 -- accel/accel.sh@21 -- # val= 00:06:46.239 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.239 07:02:04 -- accel/accel.sh@21 -- # val=0x1 00:06:46.239 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.239 07:02:04 -- accel/accel.sh@21 -- # val= 00:06:46.239 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.239 07:02:04 -- accel/accel.sh@21 -- # val= 00:06:46.239 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.239 07:02:04 -- accel/accel.sh@21 -- # val=fill 00:06:46.239 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.239 07:02:04 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.239 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val=0x80 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val= 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val=software 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@23 -- # accel_module=software 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val=64 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val=64 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- 
accel/accel.sh@21 -- # val=1 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val=Yes 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val= 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:46.240 07:02:04 -- accel/accel.sh@21 -- # val= 00:06:46.240 07:02:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # IFS=: 00:06:46.240 07:02:04 -- accel/accel.sh@20 -- # read -r var val 00:06:47.173 07:02:05 -- accel/accel.sh@21 -- # val= 00:06:47.173 07:02:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.173 07:02:05 -- accel/accel.sh@21 -- # val= 00:06:47.173 07:02:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.173 07:02:05 -- accel/accel.sh@21 -- # val= 00:06:47.173 07:02:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.173 07:02:05 -- accel/accel.sh@21 -- # val= 00:06:47.173 07:02:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.173 07:02:05 -- accel/accel.sh@21 -- # val= 00:06:47.173 07:02:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.173 07:02:05 -- accel/accel.sh@21 -- # val= 00:06:47.173 07:02:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # IFS=: 00:06:47.173 07:02:05 -- accel/accel.sh@20 -- # read -r var val 00:06:47.173 07:02:05 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:47.173 07:02:05 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:47.173 07:02:05 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:47.173 00:06:47.173 real 0m2.557s 00:06:47.173 user 0m2.309s 00:06:47.173 sys 0m0.245s 00:06:47.173 07:02:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:47.173 07:02:05 -- common/autotest_common.sh@10 -- # set +x 00:06:47.173 ************************************ 00:06:47.173 END TEST accel_fill 00:06:47.173 ************************************ 00:06:47.173 07:02:05 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:47.173 07:02:05 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:47.173 07:02:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:47.173 07:02:05 -- common/autotest_common.sh@10 -- # set +x 00:06:47.173 ************************************ 00:06:47.173 START TEST 
accel_copy_crc32c 00:06:47.173 ************************************ 00:06:47.173 07:02:05 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:06:47.173 07:02:05 -- accel/accel.sh@16 -- # local accel_opc 00:06:47.173 07:02:05 -- accel/accel.sh@17 -- # local accel_module 00:06:47.173 07:02:05 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:47.173 07:02:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:47.173 07:02:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:47.173 07:02:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:47.173 07:02:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.173 07:02:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.173 07:02:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:47.173 07:02:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:47.173 07:02:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:47.432 07:02:05 -- accel/accel.sh@42 -- # jq -r . 00:06:47.432 [2024-12-13 07:02:05.426738] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:47.432 [2024-12-13 07:02:05.426824] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480380 ] 00:06:47.432 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.432 [2024-12-13 07:02:05.493697] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.432 [2024-12-13 07:02:05.529164] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.809 07:02:06 -- accel/accel.sh@18 -- # out=' 00:06:48.809 SPDK Configuration: 00:06:48.809 Core mask: 0x1 00:06:48.809 00:06:48.809 Accel Perf Configuration: 00:06:48.809 Workload Type: copy_crc32c 00:06:48.809 CRC-32C seed: 0 00:06:48.809 Vector size: 4096 bytes 00:06:48.809 Transfer size: 4096 bytes 00:06:48.809 Vector count 1 00:06:48.809 Module: software 00:06:48.809 Queue depth: 32 00:06:48.809 Allocate depth: 32 00:06:48.809 # threads/core: 1 00:06:48.809 Run time: 1 seconds 00:06:48.809 Verify: Yes 00:06:48.809 00:06:48.809 Running for 1 seconds... 00:06:48.809 00:06:48.809 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:48.809 ------------------------------------------------------------------------------------ 00:06:48.809 0,0 424608/s 1658 MiB/s 0 0 00:06:48.809 ==================================================================================== 00:06:48.809 Total 424608/s 1658 MiB/s 0 0' 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:48.809 07:02:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:48.809 07:02:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.809 07:02:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.809 07:02:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.809 07:02:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.809 07:02:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.809 07:02:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.809 07:02:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.809 07:02:06 -- accel/accel.sh@42 -- # jq -r . 
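Every invocation in this log passes -c /dev/fd/62, and the jq -r . call above is the tail end of build_accel_config emitting JSON onto that descriptor. A minimal bash sketch of the same pattern (hypothetical illustration; the real accel.sh wiring and config contents differ):

  # hypothetical: feed accel_perf a JSON config on an anonymous fd instead of a file
  cfg='{}'   # placeholder config; real runs would carry module settings here
  accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 62< <(printf '%s' "$cfg")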
00:06:48.809 [2024-12-13 07:02:06.699894] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:48.809 [2024-12-13 07:02:06.699957] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480583 ] 00:06:48.809 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.809 [2024-12-13 07:02:06.763996] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.809 [2024-12-13 07:02:06.798236] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val= 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val= 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val=0x1 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val= 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val= 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val=0 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val= 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val=software 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@23 -- # accel_module=software 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val=32 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 
00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val=32 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val=1 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val=Yes 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val= 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:48.809 07:02:06 -- accel/accel.sh@21 -- # val= 00:06:48.809 07:02:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # IFS=: 00:06:48.809 07:02:06 -- accel/accel.sh@20 -- # read -r var val 00:06:49.745 07:02:07 -- accel/accel.sh@21 -- # val= 00:06:49.745 07:02:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # IFS=: 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # read -r var val 00:06:49.745 07:02:07 -- accel/accel.sh@21 -- # val= 00:06:49.745 07:02:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # IFS=: 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # read -r var val 00:06:49.745 07:02:07 -- accel/accel.sh@21 -- # val= 00:06:49.745 07:02:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # IFS=: 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # read -r var val 00:06:49.745 07:02:07 -- accel/accel.sh@21 -- # val= 00:06:49.745 07:02:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # IFS=: 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # read -r var val 00:06:49.745 07:02:07 -- accel/accel.sh@21 -- # val= 00:06:49.745 07:02:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # IFS=: 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # read -r var val 00:06:49.745 07:02:07 -- accel/accel.sh@21 -- # val= 00:06:49.745 07:02:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # IFS=: 00:06:49.745 07:02:07 -- accel/accel.sh@20 -- # read -r var val 00:06:49.745 07:02:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:49.745 07:02:07 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:49.745 07:02:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.745 00:06:49.745 real 0m2.552s 00:06:49.745 user 0m2.306s 00:06:49.745 sys 0m0.244s 00:06:49.745 07:02:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:49.745 07:02:07 -- common/autotest_common.sh@10 -- # set +x 00:06:49.745 ************************************ 00:06:49.745 END TEST accel_copy_crc32c 00:06:49.745 ************************************ 00:06:50.004 
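The bandwidth column follows from the transfer rate times the 4096-byte transfer size; for the first copy_crc32c run above, a quick check in shell arithmetic:

  # 424608 transfers/s * 4096 bytes, converted to MiB/s (1 MiB = 1048576 bytes)
  echo $(( 424608 * 4096 / 1048576 ))   # prints 1658, matching the 1658 MiB/s above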
07:02:07 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:50.004 07:02:07 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:50.004 07:02:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:50.004 07:02:07 -- common/autotest_common.sh@10 -- # set +x 00:06:50.004 ************************************ 00:06:50.004 START TEST accel_copy_crc32c_C2 00:06:50.004 ************************************ 00:06:50.004 07:02:08 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:50.004 07:02:08 -- accel/accel.sh@16 -- # local accel_opc 00:06:50.004 07:02:08 -- accel/accel.sh@17 -- # local accel_module 00:06:50.004 07:02:08 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:50.004 07:02:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:50.004 07:02:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.004 07:02:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.004 07:02:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.004 07:02:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.004 07:02:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.004 07:02:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.004 07:02:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.004 07:02:08 -- accel/accel.sh@42 -- # jq -r . 00:06:50.004 [2024-12-13 07:02:08.023081] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:50.004 [2024-12-13 07:02:08.023178] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480866 ] 00:06:50.004 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.004 [2024-12-13 07:02:08.091005] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.004 [2024-12-13 07:02:08.127477] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.381 07:02:09 -- accel/accel.sh@18 -- # out=' 00:06:51.381 SPDK Configuration: 00:06:51.381 Core mask: 0x1 00:06:51.381 00:06:51.381 Accel Perf Configuration: 00:06:51.381 Workload Type: copy_crc32c 00:06:51.381 CRC-32C seed: 0 00:06:51.381 Vector size: 4096 bytes 00:06:51.381 Transfer size: 8192 bytes 00:06:51.381 Vector count 2 00:06:51.381 Module: software 00:06:51.381 Queue depth: 32 00:06:51.381 Allocate depth: 32 00:06:51.381 # threads/core: 1 00:06:51.381 Run time: 1 seconds 00:06:51.381 Verify: Yes 00:06:51.381 00:06:51.381 Running for 1 seconds... 
00:06:51.381 00:06:51.381 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:51.381 ------------------------------------------------------------------------------------ 00:06:51.381 0,0 297664/s 2325 MiB/s 0 0 00:06:51.381 ==================================================================================== 00:06:51.381 Total 297664/s 1162 MiB/s 0 0' 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:51.381 07:02:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:51.381 07:02:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:51.381 07:02:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:51.381 07:02:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:51.381 07:02:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:51.381 07:02:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:51.381 07:02:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:51.381 07:02:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:51.381 07:02:09 -- accel/accel.sh@42 -- # jq -r . 00:06:51.381 [2024-12-13 07:02:09.308606] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:51.381 [2024-12-13 07:02:09.308715] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid481135 ] 00:06:51.381 EAL: No free 2048 kB hugepages reported on node 1 00:06:51.381 [2024-12-13 07:02:09.376663] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.381 [2024-12-13 07:02:09.412620] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val= 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val= 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val=0x1 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val= 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val= 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val=0 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 
00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val= 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val=software 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@23 -- # accel_module=software 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val=32 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val=32 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val=1 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val=Yes 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val= 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:51.381 07:02:09 -- accel/accel.sh@21 -- # val= 00:06:51.381 07:02:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # IFS=: 00:06:51.381 07:02:09 -- accel/accel.sh@20 -- # read -r var val 00:06:52.758 07:02:10 -- accel/accel.sh@21 -- # val= 00:06:52.758 07:02:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # IFS=: 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # read -r var val 00:06:52.758 07:02:10 -- accel/accel.sh@21 -- # val= 00:06:52.758 07:02:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # IFS=: 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # read -r var val 00:06:52.758 07:02:10 -- accel/accel.sh@21 -- # val= 00:06:52.758 07:02:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # IFS=: 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # read -r var val 00:06:52.758 07:02:10 -- accel/accel.sh@21 -- # val= 00:06:52.758 07:02:10 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # IFS=: 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # read -r var val 00:06:52.758 07:02:10 -- accel/accel.sh@21 -- # val= 00:06:52.758 07:02:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # IFS=: 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # read -r var val 00:06:52.758 07:02:10 -- accel/accel.sh@21 -- # val= 00:06:52.758 07:02:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # IFS=: 00:06:52.758 07:02:10 -- accel/accel.sh@20 -- # read -r var val 00:06:52.758 07:02:10 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:52.758 07:02:10 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:52.758 07:02:10 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:52.758 00:06:52.758 real 0m2.574s 00:06:52.758 user 0m2.312s 00:06:52.758 sys 0m0.258s 00:06:52.758 07:02:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:52.758 07:02:10 -- common/autotest_common.sh@10 -- # set +x 00:06:52.759 ************************************ 00:06:52.759 END TEST accel_copy_crc32c_C2 00:06:52.759 ************************************ 00:06:52.759 07:02:10 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:52.759 07:02:10 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:52.759 07:02:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:52.759 07:02:10 -- common/autotest_common.sh@10 -- # set +x 00:06:52.759 ************************************ 00:06:52.759 START TEST accel_dualcast 00:06:52.759 ************************************ 00:06:52.759 07:02:10 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:06:52.759 07:02:10 -- accel/accel.sh@16 -- # local accel_opc 00:06:52.759 07:02:10 -- accel/accel.sh@17 -- # local accel_module 00:06:52.759 07:02:10 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:52.759 07:02:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:52.759 07:02:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.759 07:02:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.759 07:02:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.759 07:02:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.759 07:02:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.759 07:02:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.759 07:02:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.759 07:02:10 -- accel/accel.sh@42 -- # jq -r . 00:06:52.759 [2024-12-13 07:02:10.637226] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:52.759 [2024-12-13 07:02:10.637312] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid481423 ] 00:06:52.759 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.759 [2024-12-13 07:02:10.704321] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.759 [2024-12-13 07:02:10.739378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.695 07:02:11 -- accel/accel.sh@18 -- # out=' 00:06:53.695 SPDK Configuration: 00:06:53.695 Core mask: 0x1 00:06:53.695 00:06:53.695 Accel Perf Configuration: 00:06:53.695 Workload Type: dualcast 00:06:53.695 Transfer size: 4096 bytes 00:06:53.695 Vector count 1 00:06:53.695 Module: software 00:06:53.695 Queue depth: 32 00:06:53.695 Allocate depth: 32 00:06:53.695 # threads/core: 1 00:06:53.695 Run time: 1 seconds 00:06:53.695 Verify: Yes 00:06:53.695 00:06:53.695 Running for 1 seconds... 00:06:53.695 00:06:53.695 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:53.695 ------------------------------------------------------------------------------------ 00:06:53.695 0,0 643616/s 2514 MiB/s 0 0 00:06:53.695 ==================================================================================== 00:06:53.695 Total 643616/s 2514 MiB/s 0 0' 00:06:53.695 07:02:11 -- accel/accel.sh@20 -- # IFS=: 00:06:53.695 07:02:11 -- accel/accel.sh@20 -- # read -r var val 00:06:53.695 07:02:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:53.695 07:02:11 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.695 07:02:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:53.695 07:02:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.695 07:02:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.695 07:02:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.695 07:02:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.695 07:02:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.695 07:02:11 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.695 07:02:11 -- accel/accel.sh@42 -- # jq -r . 00:06:53.695 [2024-12-13 07:02:11.920967] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
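For reference while reading the dualcast numbers above: each operation writes one 4096-byte source to two destination buffers, and the bandwidth column counts the per-transfer size once (643616/s x 4096 B is about 2514 MiB/s, matching the table). A sketch of the semantics in plain Python, names ours:

def dualcast(src: bytes) -> tuple[bytes, bytes]:
    """One read, two identical writes per operation."""
    return bytes(src), bytes(src)

d1, d2 = dualcast(bytes(4096))   # "Transfer size: 4096 bytes" above
assert d1 == d2 == bytes(4096)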
00:06:53.695 [2024-12-13 07:02:11.921053] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid481679 ] 00:06:53.954 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.954 [2024-12-13 07:02:11.988764] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.954 [2024-12-13 07:02:12.023440] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val= 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val= 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val=0x1 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val= 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val= 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val=dualcast 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val= 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val=software 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@23 -- # accel_module=software 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val=32 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val=32 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val=1 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val=Yes 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val= 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:53.954 07:02:12 -- accel/accel.sh@21 -- # val= 00:06:53.954 07:02:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # IFS=: 00:06:53.954 07:02:12 -- accel/accel.sh@20 -- # read -r var val 00:06:55.332 07:02:13 -- accel/accel.sh@21 -- # val= 00:06:55.332 07:02:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.332 07:02:13 -- accel/accel.sh@21 -- # val= 00:06:55.332 07:02:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.332 07:02:13 -- accel/accel.sh@21 -- # val= 00:06:55.332 07:02:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.332 07:02:13 -- accel/accel.sh@21 -- # val= 00:06:55.332 07:02:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.332 07:02:13 -- accel/accel.sh@21 -- # val= 00:06:55.332 07:02:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.332 07:02:13 -- accel/accel.sh@21 -- # val= 00:06:55.332 07:02:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # IFS=: 00:06:55.332 07:02:13 -- accel/accel.sh@20 -- # read -r var val 00:06:55.332 07:02:13 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:55.332 07:02:13 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:55.332 07:02:13 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:55.332 00:06:55.332 real 0m2.568s 00:06:55.332 user 0m2.313s 00:06:55.332 sys 0m0.251s 00:06:55.332 07:02:13 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:55.332 07:02:13 -- common/autotest_common.sh@10 -- # set +x 00:06:55.332 ************************************ 00:06:55.332 END TEST accel_dualcast 00:06:55.332 ************************************ 00:06:55.332 07:02:13 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:55.332 07:02:13 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:55.332 07:02:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:55.332 07:02:13 -- common/autotest_common.sh@10 -- # set +x 00:06:55.332 ************************************ 00:06:55.332 START TEST accel_compare 00:06:55.332 ************************************ 00:06:55.332 07:02:13 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:06:55.332 07:02:13 -- accel/accel.sh@16 -- # local accel_opc 00:06:55.332 07:02:13 -- 
accel/accel.sh@17 -- # local accel_module 00:06:55.332 07:02:13 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:55.332 07:02:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:55.332 07:02:13 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.332 07:02:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.332 07:02:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.332 07:02:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.332 07:02:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.332 07:02:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.332 07:02:13 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.332 07:02:13 -- accel/accel.sh@42 -- # jq -r . 00:06:55.332 [2024-12-13 07:02:13.247720] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:55.332 [2024-12-13 07:02:13.247806] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid481864 ] 00:06:55.332 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.332 [2024-12-13 07:02:13.314716] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.332 [2024-12-13 07:02:13.350477] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.708 07:02:14 -- accel/accel.sh@18 -- # out=' 00:06:56.708 SPDK Configuration: 00:06:56.708 Core mask: 0x1 00:06:56.708 00:06:56.708 Accel Perf Configuration: 00:06:56.708 Workload Type: compare 00:06:56.708 Transfer size: 4096 bytes 00:06:56.708 Vector count 1 00:06:56.708 Module: software 00:06:56.708 Queue depth: 32 00:06:56.708 Allocate depth: 32 00:06:56.708 # threads/core: 1 00:06:56.708 Run time: 1 seconds 00:06:56.708 Verify: Yes 00:06:56.708 00:06:56.708 Running for 1 seconds... 00:06:56.708 00:06:56.708 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:56.708 ------------------------------------------------------------------------------------ 00:06:56.708 0,0 848928/s 3316 MiB/s 0 0 00:06:56.708 ==================================================================================== 00:06:56.708 Total 848928/s 3316 MiB/s 0 0' 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.708 07:02:14 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:56.708 07:02:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:56.708 07:02:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.708 07:02:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.708 07:02:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.708 07:02:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.708 07:02:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.708 07:02:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.708 07:02:14 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.708 07:02:14 -- accel/accel.sh@42 -- # jq -r . 00:06:56.708 [2024-12-13 07:02:14.520395] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
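The compare run above checks two equal-sized buffers for equality; any mismatch would land in the Miscompares column, which stays at 0 throughout this log. A memcmp-style model of the check, illustrative only:

def compare(a: bytes, b: bytes) -> int:
    """Return 0 when the buffers match, nonzero on the first mismatch."""
    assert len(a) == len(b)
    for i in range(len(a)):
        if a[i] != b[i]:
            return i + 1
    return 0

assert compare(b"\x00" * 4096, b"\x00" * 4096) == 0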
00:06:56.708 [2024-12-13 07:02:14.520457] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid482018 ] 00:06:56.708 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.708 [2024-12-13 07:02:14.583828] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.708 [2024-12-13 07:02:14.617764] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.708 07:02:14 -- accel/accel.sh@21 -- # val= 00:06:56.708 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.708 07:02:14 -- accel/accel.sh@21 -- # val= 00:06:56.708 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.708 07:02:14 -- accel/accel.sh@21 -- # val=0x1 00:06:56.708 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.708 07:02:14 -- accel/accel.sh@21 -- # val= 00:06:56.708 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.708 07:02:14 -- accel/accel.sh@21 -- # val= 00:06:56.708 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.708 07:02:14 -- accel/accel.sh@21 -- # val=compare 00:06:56.708 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.708 07:02:14 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.708 07:02:14 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:56.708 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.708 07:02:14 -- accel/accel.sh@21 -- # val= 00:06:56.708 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.708 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.708 07:02:14 -- accel/accel.sh@21 -- # val=software 00:06:56.708 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.709 07:02:14 -- accel/accel.sh@23 -- # accel_module=software 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.709 07:02:14 -- accel/accel.sh@21 -- # val=32 00:06:56.709 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.709 07:02:14 -- accel/accel.sh@21 -- # val=32 00:06:56.709 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.709 07:02:14 -- accel/accel.sh@21 -- # val=1 00:06:56.709 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.709 07:02:14 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:56.709 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.709 07:02:14 -- accel/accel.sh@21 -- # val=Yes 00:06:56.709 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.709 07:02:14 -- accel/accel.sh@21 -- # val= 00:06:56.709 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:56.709 07:02:14 -- accel/accel.sh@21 -- # val= 00:06:56.709 07:02:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # IFS=: 00:06:56.709 07:02:14 -- accel/accel.sh@20 -- # read -r var val 00:06:57.643 07:02:15 -- accel/accel.sh@21 -- # val= 00:06:57.643 07:02:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # IFS=: 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # read -r var val 00:06:57.643 07:02:15 -- accel/accel.sh@21 -- # val= 00:06:57.643 07:02:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # IFS=: 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # read -r var val 00:06:57.643 07:02:15 -- accel/accel.sh@21 -- # val= 00:06:57.643 07:02:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # IFS=: 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # read -r var val 00:06:57.643 07:02:15 -- accel/accel.sh@21 -- # val= 00:06:57.643 07:02:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # IFS=: 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # read -r var val 00:06:57.643 07:02:15 -- accel/accel.sh@21 -- # val= 00:06:57.643 07:02:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # IFS=: 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # read -r var val 00:06:57.643 07:02:15 -- accel/accel.sh@21 -- # val= 00:06:57.643 07:02:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # IFS=: 00:06:57.643 07:02:15 -- accel/accel.sh@20 -- # read -r var val 00:06:57.643 07:02:15 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:57.643 07:02:15 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:57.643 07:02:15 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:57.643 00:06:57.643 real 0m2.551s 00:06:57.643 user 0m2.309s 00:06:57.643 sys 0m0.238s 00:06:57.643 07:02:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:57.643 07:02:15 -- common/autotest_common.sh@10 -- # set +x 00:06:57.643 ************************************ 00:06:57.643 END TEST accel_compare 00:06:57.643 ************************************ 00:06:57.643 07:02:15 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:57.643 07:02:15 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:57.643 07:02:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:57.643 07:02:15 -- common/autotest_common.sh@10 -- # set +x 00:06:57.643 ************************************ 00:06:57.643 START TEST accel_xor 00:06:57.643 ************************************ 00:06:57.643 07:02:15 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:06:57.643 07:02:15 -- accel/accel.sh@16 -- # local accel_opc 00:06:57.643 07:02:15 -- accel/accel.sh@17 
-- # local accel_module 00:06:57.643 07:02:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:57.643 07:02:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:57.643 07:02:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.643 07:02:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:57.643 07:02:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.643 07:02:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.643 07:02:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:57.643 07:02:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:57.643 07:02:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:57.643 07:02:15 -- accel/accel.sh@42 -- # jq -r . 00:06:57.643 [2024-12-13 07:02:15.837454] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:57.643 [2024-12-13 07:02:15.837542] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid482286 ] 00:06:57.643 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.902 [2024-12-13 07:02:15.903866] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.902 [2024-12-13 07:02:15.938604] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.279 07:02:17 -- accel/accel.sh@18 -- # out=' 00:06:59.279 SPDK Configuration: 00:06:59.279 Core mask: 0x1 00:06:59.279 00:06:59.279 Accel Perf Configuration: 00:06:59.279 Workload Type: xor 00:06:59.279 Source buffers: 2 00:06:59.279 Transfer size: 4096 bytes 00:06:59.279 Vector count 1 00:06:59.279 Module: software 00:06:59.279 Queue depth: 32 00:06:59.279 Allocate depth: 32 00:06:59.279 # threads/core: 1 00:06:59.279 Run time: 1 seconds 00:06:59.279 Verify: Yes 00:06:59.279 00:06:59.279 Running for 1 seconds... 00:06:59.279 00:06:59.279 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:59.279 ------------------------------------------------------------------------------------ 00:06:59.279 0,0 708352/s 2767 MiB/s 0 0 00:06:59.279 ==================================================================================== 00:06:59.279 Total 708352/s 2767 MiB/s 0 0' 00:06:59.279 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.279 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.279 07:02:17 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:59.279 07:02:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:59.279 07:02:17 -- accel/accel.sh@12 -- # build_accel_config 00:06:59.279 07:02:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:59.279 07:02:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.279 07:02:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.279 07:02:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:59.279 07:02:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:59.279 07:02:17 -- accel/accel.sh@41 -- # local IFS=, 00:06:59.279 07:02:17 -- accel/accel.sh@42 -- # jq -r . 00:06:59.279 [2024-12-13 07:02:17.118858] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
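The xor table above uses the default of two source buffers ("Source buffers: 2"): each operation XORs the sources byte-wise into one 4096-byte destination. A small Python model of that combine step:

from functools import reduce

def xor_buffers(*sources: bytes) -> bytes:
    """Byte-wise XOR of equal-length source buffers into a new buffer."""
    assert len({len(s) for s in sources}) == 1
    return bytes(reduce(lambda x, y: x ^ y, column) for column in zip(*sources))

a = bytes(range(256)) * 16            # 4096 bytes
b = b"\xff" * 4096
assert xor_buffers(a, b) == bytes(x ^ 0xFF for x in a)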
00:06:59.279 [2024-12-13 07:02:17.118943] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid482554 ] 00:06:59.279 EAL: No free 2048 kB hugepages reported on node 1 00:06:59.279 [2024-12-13 07:02:17.184705] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.279 [2024-12-13 07:02:17.218251] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.279 07:02:17 -- accel/accel.sh@21 -- # val= 00:06:59.279 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.279 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.279 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.279 07:02:17 -- accel/accel.sh@21 -- # val= 00:06:59.279 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.279 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.279 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.279 07:02:17 -- accel/accel.sh@21 -- # val=0x1 00:06:59.279 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.279 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.279 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.279 07:02:17 -- accel/accel.sh@21 -- # val= 00:06:59.279 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.279 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val= 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val=xor 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val=2 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val= 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val=software 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@23 -- # accel_module=software 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val=32 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val=32 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- 
accel/accel.sh@21 -- # val=1 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val=Yes 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val= 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:06:59.280 07:02:17 -- accel/accel.sh@21 -- # val= 00:06:59.280 07:02:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # IFS=: 00:06:59.280 07:02:17 -- accel/accel.sh@20 -- # read -r var val 00:07:00.215 07:02:18 -- accel/accel.sh@21 -- # val= 00:07:00.215 07:02:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # IFS=: 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # read -r var val 00:07:00.215 07:02:18 -- accel/accel.sh@21 -- # val= 00:07:00.215 07:02:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # IFS=: 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # read -r var val 00:07:00.215 07:02:18 -- accel/accel.sh@21 -- # val= 00:07:00.215 07:02:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # IFS=: 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # read -r var val 00:07:00.215 07:02:18 -- accel/accel.sh@21 -- # val= 00:07:00.215 07:02:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # IFS=: 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # read -r var val 00:07:00.215 07:02:18 -- accel/accel.sh@21 -- # val= 00:07:00.215 07:02:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # IFS=: 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # read -r var val 00:07:00.215 07:02:18 -- accel/accel.sh@21 -- # val= 00:07:00.215 07:02:18 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # IFS=: 00:07:00.215 07:02:18 -- accel/accel.sh@20 -- # read -r var val 00:07:00.215 07:02:18 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:00.215 07:02:18 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:00.215 07:02:18 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.215 00:07:00.215 real 0m2.560s 00:07:00.215 user 0m2.300s 00:07:00.215 sys 0m0.256s 00:07:00.215 07:02:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:00.215 07:02:18 -- common/autotest_common.sh@10 -- # set +x 00:07:00.215 ************************************ 00:07:00.215 END TEST accel_xor 00:07:00.215 ************************************ 00:07:00.215 07:02:18 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:00.215 07:02:18 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:00.215 07:02:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:00.215 07:02:18 -- common/autotest_common.sh@10 -- # set +x 00:07:00.215 ************************************ 00:07:00.215 START TEST accel_xor 
00:07:00.215 ************************************ 00:07:00.215 07:02:18 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:07:00.215 07:02:18 -- accel/accel.sh@16 -- # local accel_opc 00:07:00.215 07:02:18 -- accel/accel.sh@17 -- # local accel_module 00:07:00.215 07:02:18 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:07:00.215 07:02:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:00.215 07:02:18 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.215 07:02:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.215 07:02:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.215 07:02:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.215 07:02:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.215 07:02:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.215 07:02:18 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.215 07:02:18 -- accel/accel.sh@42 -- # jq -r . 00:07:00.215 [2024-12-13 07:02:18.442797] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:00.215 [2024-12-13 07:02:18.442883] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid482835 ] 00:07:00.473 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.473 [2024-12-13 07:02:18.508044] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.473 [2024-12-13 07:02:18.543461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.850 07:02:19 -- accel/accel.sh@18 -- # out=' 00:07:01.850 SPDK Configuration: 00:07:01.850 Core mask: 0x1 00:07:01.850 00:07:01.850 Accel Perf Configuration: 00:07:01.850 Workload Type: xor 00:07:01.850 Source buffers: 3 00:07:01.850 Transfer size: 4096 bytes 00:07:01.850 Vector count 1 00:07:01.850 Module: software 00:07:01.850 Queue depth: 32 00:07:01.850 Allocate depth: 32 00:07:01.850 # threads/core: 1 00:07:01.850 Run time: 1 seconds 00:07:01.850 Verify: Yes 00:07:01.850 00:07:01.850 Running for 1 seconds... 00:07:01.850 00:07:01.850 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:01.850 ------------------------------------------------------------------------------------ 00:07:01.850 0,0 668096/s 2609 MiB/s 0 0 00:07:01.850 ==================================================================================== 00:07:01.850 Total 668096/s 2609 MiB/s 0 0' 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:01.850 07:02:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:01.850 07:02:19 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.850 07:02:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:01.850 07:02:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.850 07:02:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.850 07:02:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:01.850 07:02:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:01.850 07:02:19 -- accel/accel.sh@41 -- # local IFS=, 00:07:01.850 07:02:19 -- accel/accel.sh@42 -- # jq -r . 00:07:01.850 [2024-12-13 07:02:19.712943] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
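This second xor test passes -x 3, so three source buffers feed each 4096-byte result; throughput drops only modestly (708352/s to 668096/s) because the extra source adds one more read stream per operation. Reusing the xor_buffers helper from the previous sketch:

srcs = [bytes([v]) * 4096 for v in (0x0F, 0x33, 0x55)]
out = xor_buffers(*srcs)                           # helper sketched earlier
assert out == bytes([0x0F ^ 0x33 ^ 0x55]) * 4096   # 0x69 in every byte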
00:07:01.850 [2024-12-13 07:02:19.713004] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483101 ] 00:07:01.850 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.850 [2024-12-13 07:02:19.772704] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.850 [2024-12-13 07:02:19.806608] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val= 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val= 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val=0x1 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val= 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val= 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val=xor 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val=3 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val= 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val=software 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@23 -- # accel_module=software 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val=32 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val=32 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- 
accel/accel.sh@21 -- # val=1 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val=Yes 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val= 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:01.850 07:02:19 -- accel/accel.sh@21 -- # val= 00:07:01.850 07:02:19 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # IFS=: 00:07:01.850 07:02:19 -- accel/accel.sh@20 -- # read -r var val 00:07:02.787 07:02:20 -- accel/accel.sh@21 -- # val= 00:07:02.787 07:02:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # IFS=: 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # read -r var val 00:07:02.787 07:02:20 -- accel/accel.sh@21 -- # val= 00:07:02.787 07:02:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # IFS=: 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # read -r var val 00:07:02.787 07:02:20 -- accel/accel.sh@21 -- # val= 00:07:02.787 07:02:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # IFS=: 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # read -r var val 00:07:02.787 07:02:20 -- accel/accel.sh@21 -- # val= 00:07:02.787 07:02:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # IFS=: 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # read -r var val 00:07:02.787 07:02:20 -- accel/accel.sh@21 -- # val= 00:07:02.787 07:02:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # IFS=: 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # read -r var val 00:07:02.787 07:02:20 -- accel/accel.sh@21 -- # val= 00:07:02.787 07:02:20 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # IFS=: 00:07:02.787 07:02:20 -- accel/accel.sh@20 -- # read -r var val 00:07:02.787 07:02:20 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:02.787 07:02:20 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:02.787 07:02:20 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.787 00:07:02.787 real 0m2.547s 00:07:02.787 user 0m2.309s 00:07:02.787 sys 0m0.234s 00:07:02.787 07:02:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:02.787 07:02:20 -- common/autotest_common.sh@10 -- # set +x 00:07:02.787 ************************************ 00:07:02.787 END TEST accel_xor 00:07:02.787 ************************************ 00:07:02.787 07:02:21 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:02.787 07:02:21 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:02.787 07:02:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:02.787 07:02:21 -- common/autotest_common.sh@10 -- # set +x 00:07:02.787 ************************************ 00:07:02.787 START TEST 
accel_dif_verify 00:07:02.787 ************************************ 00:07:02.787 07:02:21 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:07:02.787 07:02:21 -- accel/accel.sh@16 -- # local accel_opc 00:07:02.787 07:02:21 -- accel/accel.sh@17 -- # local accel_module 00:07:02.787 07:02:21 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:07:02.787 07:02:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:02.787 07:02:21 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.787 07:02:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.787 07:02:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.787 07:02:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.787 07:02:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.787 07:02:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.787 07:02:21 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.787 07:02:21 -- accel/accel.sh@42 -- # jq -r . 00:07:03.046 [2024-12-13 07:02:21.034196] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:03.046 [2024-12-13 07:02:21.034288] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483350 ] 00:07:03.046 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.046 [2024-12-13 07:02:21.104063] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.046 [2024-12-13 07:02:21.139459] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.424 07:02:22 -- accel/accel.sh@18 -- # out=' 00:07:04.424 SPDK Configuration: 00:07:04.424 Core mask: 0x1 00:07:04.424 00:07:04.424 Accel Perf Configuration: 00:07:04.424 Workload Type: dif_verify 00:07:04.424 Vector size: 4096 bytes 00:07:04.424 Transfer size: 4096 bytes 00:07:04.424 Block size: 512 bytes 00:07:04.424 Metadata size: 8 bytes 00:07:04.424 Vector count 1 00:07:04.424 Module: software 00:07:04.424 Queue depth: 32 00:07:04.424 Allocate depth: 32 00:07:04.424 # threads/core: 1 00:07:04.424 Run time: 1 seconds 00:07:04.424 Verify: No 00:07:04.424 00:07:04.424 Running for 1 seconds... 00:07:04.424 00:07:04.424 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:04.424 ------------------------------------------------------------------------------------ 00:07:04.424 0,0 248896/s 987 MiB/s 0 0 00:07:04.424 ==================================================================================== 00:07:04.424 Total 248896/s 972 MiB/s 0 0' 00:07:04.424 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.424 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.424 07:02:22 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:04.424 07:02:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:04.424 07:02:22 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.424 07:02:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.424 07:02:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.424 07:02:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.424 07:02:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.424 07:02:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.424 07:02:22 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.424 07:02:22 -- accel/accel.sh@42 -- # jq -r . 
00:07:04.424 [2024-12-13 07:02:22.318219] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:04.424 [2024-12-13 07:02:22.318312] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483496 ] 00:07:04.424 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.424 [2024-12-13 07:02:22.387902] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.425 [2024-12-13 07:02:22.423707] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val= 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val= 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val=0x1 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val= 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val= 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val=dif_verify 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val= 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val=software 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val=32 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val=32 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val=1 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val=No 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val= 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:04.425 07:02:22 -- accel/accel.sh@21 -- # val= 00:07:04.425 07:02:22 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # IFS=: 00:07:04.425 07:02:22 -- accel/accel.sh@20 -- # read -r var val 00:07:05.360 07:02:23 -- accel/accel.sh@21 -- # val= 00:07:05.360 07:02:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # IFS=: 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # read -r var val 00:07:05.360 07:02:23 -- accel/accel.sh@21 -- # val= 00:07:05.360 07:02:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # IFS=: 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # read -r var val 00:07:05.360 07:02:23 -- accel/accel.sh@21 -- # val= 00:07:05.360 07:02:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # IFS=: 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # read -r var val 00:07:05.360 07:02:23 -- accel/accel.sh@21 -- # val= 00:07:05.360 07:02:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # IFS=: 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # read -r var val 00:07:05.360 07:02:23 -- accel/accel.sh@21 -- # val= 00:07:05.360 07:02:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # IFS=: 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # read -r var val 00:07:05.360 07:02:23 -- accel/accel.sh@21 -- # val= 00:07:05.360 07:02:23 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # IFS=: 00:07:05.360 07:02:23 -- accel/accel.sh@20 -- # read -r var val 00:07:05.360 07:02:23 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:05.360 07:02:23 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:05.360 07:02:23 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:05.360 00:07:05.360 real 0m2.571s 00:07:05.360 user 0m2.299s 00:07:05.360 sys 0m0.270s 00:07:05.360 07:02:23 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:07:05.360 07:02:23 -- common/autotest_common.sh@10 -- # set +x 00:07:05.360 ************************************ 00:07:05.360 END TEST accel_dif_verify 00:07:05.360 ************************************ 00:07:05.620 07:02:23 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:05.620 07:02:23 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:05.620 07:02:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:05.620 07:02:23 -- common/autotest_common.sh@10 -- # set +x 00:07:05.620 ************************************ 00:07:05.620 START TEST accel_dif_generate 00:07:05.620 ************************************ 00:07:05.620 07:02:23 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:07:05.620 07:02:23 -- accel/accel.sh@16 -- # local accel_opc 00:07:05.620 07:02:23 -- accel/accel.sh@17 -- # local accel_module 00:07:05.620 07:02:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:05.620 07:02:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:05.620 07:02:23 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.620 07:02:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.620 07:02:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.620 07:02:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.620 07:02:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.620 07:02:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.620 07:02:23 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.620 07:02:23 -- accel/accel.sh@42 -- # jq -r . 00:07:05.620 [2024-12-13 07:02:23.647139] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:05.620 [2024-12-13 07:02:23.647265] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483701 ] 00:07:05.620 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.620 [2024-12-13 07:02:23.715912] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.620 [2024-12-13 07:02:23.751080] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.997 07:02:24 -- accel/accel.sh@18 -- # out=' 00:07:06.997 SPDK Configuration: 00:07:06.997 Core mask: 0x1 00:07:06.997 00:07:06.997 Accel Perf Configuration: 00:07:06.997 Workload Type: dif_generate 00:07:06.997 Vector size: 4096 bytes 00:07:06.997 Transfer size: 4096 bytes 00:07:06.997 Block size: 512 bytes 00:07:06.997 Metadata size: 8 bytes 00:07:06.997 Vector count 1 00:07:06.997 Module: software 00:07:06.997 Queue depth: 32 00:07:06.998 Allocate depth: 32 00:07:06.998 # threads/core: 1 00:07:06.998 Run time: 1 seconds 00:07:06.998 Verify: No 00:07:06.998 00:07:06.998 Running for 1 seconds... 
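Every case in this file follows the same banner/timing/banner shape. A simplified, assumption-labelled bash sketch of that pattern (SPDK's real run_test and accel_test also manage xtrace state and the shared accel config, so this is an illustration, not the harness code):

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  run_test() {
    local name=$1; shift
    echo "************ START TEST $name ************"
    time "$@"          # emits the real/user/sys summary seen in the log
    echo "************ END TEST $name ************"
  }
  accel_test() {       # stand-in for the harness helper of the same name
    "$SPDK/build/examples/accel_perf" "$@"
  }
  run_test accel_dif_generate accel_test -t 1 -w dif_generate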
00:07:06.998 00:07:06.998 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:06.998 ------------------------------------------------------------------------------------ 00:07:06.998 0,0 283520/s 1124 MiB/s 0 0 00:07:06.998 ==================================================================================== 00:07:06.998 Total 283520/s 1107 MiB/s 0 0' 00:07:06.998 07:02:24 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:24 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:24 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:06.998 07:02:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:06.998 07:02:24 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.998 07:02:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:06.998 07:02:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.998 07:02:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.998 07:02:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:06.998 07:02:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:06.998 07:02:24 -- accel/accel.sh@41 -- # local IFS=, 00:07:06.998 07:02:24 -- accel/accel.sh@42 -- # jq -r . 00:07:06.998 [2024-12-13 07:02:24.920879] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:06.998 [2024-12-13 07:02:24.920939] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483964 ] 00:07:06.998 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.998 [2024-12-13 07:02:24.982283] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.998 [2024-12-13 07:02:25.016406] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val= 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val= 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val=0x1 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val= 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val= 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val=dif_generate 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 
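A note on reading these result tables: Transfers is operations per second, and bandwidth is that rate multiplied by the transfer size. For the dif_generate run above, 283520/s × 4096 bytes ≈ 1,161 MB/s ≈ 1107 MiB/s, which matches the Total row exactly; the slightly higher per-core figure (1124 MiB/s) is presumably computed over that core's own elapsed time rather than the overall wall clock.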
00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val= 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val=software 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@23 -- # accel_module=software 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val=32 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val=32 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val=1 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val=No 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val= 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:06.998 07:02:25 -- accel/accel.sh@21 -- # val= 00:07:06.998 07:02:25 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # IFS=: 00:07:06.998 07:02:25 -- accel/accel.sh@20 -- # read -r var val 00:07:08.374 07:02:26 -- accel/accel.sh@21 -- # val= 00:07:08.374 07:02:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.374 07:02:26 -- accel/accel.sh@20 -- # IFS=: 00:07:08.374 07:02:26 -- accel/accel.sh@20 -- # read -r var val 00:07:08.374 07:02:26 -- accel/accel.sh@21 -- # val= 00:07:08.374 07:02:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # IFS=: 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # read -r var val 00:07:08.375 07:02:26 -- accel/accel.sh@21 -- # val= 00:07:08.375 07:02:26 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # IFS=: 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # read -r var val 00:07:08.375 07:02:26 -- accel/accel.sh@21 -- # val= 00:07:08.375 07:02:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # IFS=: 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # read -r var val 00:07:08.375 07:02:26 -- accel/accel.sh@21 -- # val= 00:07:08.375 07:02:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # IFS=: 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # read -r var val 00:07:08.375 07:02:26 -- accel/accel.sh@21 -- # val= 00:07:08.375 07:02:26 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # IFS=: 00:07:08.375 07:02:26 -- accel/accel.sh@20 -- # read -r var val 00:07:08.375 07:02:26 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:08.375 07:02:26 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:08.375 07:02:26 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.375 00:07:08.375 real 0m2.550s 00:07:08.375 user 0m1.154s 00:07:08.375 sys 0m0.133s 00:07:08.375 07:02:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:08.375 07:02:26 -- common/autotest_common.sh@10 -- # set +x 00:07:08.375 ************************************ 00:07:08.375 END TEST accel_dif_generate 00:07:08.375 ************************************ 00:07:08.375 07:02:26 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:08.375 07:02:26 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:08.375 07:02:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:08.375 07:02:26 -- common/autotest_common.sh@10 -- # set +x 00:07:08.375 ************************************ 00:07:08.375 START TEST accel_dif_generate_copy 00:07:08.375 ************************************ 00:07:08.375 07:02:26 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:07:08.375 07:02:26 -- accel/accel.sh@16 -- # local accel_opc 00:07:08.375 07:02:26 -- accel/accel.sh@17 -- # local accel_module 00:07:08.375 07:02:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:08.375 07:02:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:08.375 07:02:26 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.375 07:02:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.375 07:02:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.375 07:02:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.375 07:02:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.375 07:02:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.375 07:02:26 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.375 07:02:26 -- accel/accel.sh@42 -- # jq -r . 00:07:08.375 [2024-12-13 07:02:26.238396] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:08.375 [2024-12-13 07:02:26.238479] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid484253 ] 00:07:08.375 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.375 [2024-12-13 07:02:26.305990] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.375 [2024-12-13 07:02:26.341297] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.311 07:02:27 -- accel/accel.sh@18 -- # out=' 00:07:09.311 SPDK Configuration: 00:07:09.311 Core mask: 0x1 00:07:09.311 00:07:09.311 Accel Perf Configuration: 00:07:09.311 Workload Type: dif_generate_copy 00:07:09.311 Vector size: 4096 bytes 00:07:09.311 Transfer size: 4096 bytes 00:07:09.311 Vector count 1 00:07:09.311 Module: software 00:07:09.311 Queue depth: 32 00:07:09.311 Allocate depth: 32 00:07:09.311 # threads/core: 1 00:07:09.311 Run time: 1 seconds 00:07:09.311 Verify: No 00:07:09.311 00:07:09.311 Running for 1 seconds... 00:07:09.311 00:07:09.311 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:09.311 ------------------------------------------------------------------------------------ 00:07:09.311 0,0 223616/s 887 MiB/s 0 0 00:07:09.311 ==================================================================================== 00:07:09.311 Total 223616/s 873 MiB/s 0 0' 00:07:09.311 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.311 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.311 07:02:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:09.311 07:02:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:09.311 07:02:27 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.311 07:02:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.311 07:02:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.311 07:02:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.311 07:02:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.311 07:02:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.311 07:02:27 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.311 07:02:27 -- accel/accel.sh@42 -- # jq -r . 00:07:09.311 [2024-12-13 07:02:27.510978] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
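The recurring "EAL: No free 2048 kB hugepages reported on node 1" notice means DPDK found no 2 MB hugepages reserved on that NUMA node; it is informational here, since every run completes. If it ever turned into an allocation failure, pages can be reserved through the standard sysfs interface (node and count below are illustrative; SPDK setups more commonly run scripts/setup.sh for this):

  # Reserve 1024 x 2 MiB hugepages on NUMA node 1.
  echo 1024 | sudo tee /sys/devices/system/node/node1/hugepages/hugepages-2048kB/nr_hugepages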
00:07:09.311 [2024-12-13 07:02:27.511039] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid484519 ] 00:07:09.311 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.571 [2024-12-13 07:02:27.572295] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.571 [2024-12-13 07:02:27.606407] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val= 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val= 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val=0x1 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val= 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val= 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val= 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val=software 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@23 -- # accel_module=software 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val=32 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val=32 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var 
val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val=1 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val=No 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val= 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:09.571 07:02:27 -- accel/accel.sh@21 -- # val= 00:07:09.571 07:02:27 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # IFS=: 00:07:09.571 07:02:27 -- accel/accel.sh@20 -- # read -r var val 00:07:10.948 07:02:28 -- accel/accel.sh@21 -- # val= 00:07:10.948 07:02:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.948 07:02:28 -- accel/accel.sh@21 -- # val= 00:07:10.948 07:02:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.948 07:02:28 -- accel/accel.sh@21 -- # val= 00:07:10.948 07:02:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.948 07:02:28 -- accel/accel.sh@21 -- # val= 00:07:10.948 07:02:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.948 07:02:28 -- accel/accel.sh@21 -- # val= 00:07:10.948 07:02:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.948 07:02:28 -- accel/accel.sh@21 -- # val= 00:07:10.948 07:02:28 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # IFS=: 00:07:10.948 07:02:28 -- accel/accel.sh@20 -- # read -r var val 00:07:10.948 07:02:28 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:10.948 07:02:28 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:10.948 07:02:28 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:10.948 00:07:10.948 real 0m2.548s 00:07:10.948 user 0m2.291s 00:07:10.948 sys 0m0.255s 00:07:10.948 07:02:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:10.948 07:02:28 -- common/autotest_common.sh@10 -- # set +x 00:07:10.948 ************************************ 00:07:10.948 END TEST accel_dif_generate_copy 00:07:10.948 ************************************ 00:07:10.948 07:02:28 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:10.948 07:02:28 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.948 07:02:28 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:10.948 07:02:28 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:07:10.948 07:02:28 -- common/autotest_common.sh@10 -- # set +x 00:07:10.948 ************************************ 00:07:10.948 START TEST accel_comp 00:07:10.948 ************************************ 00:07:10.948 07:02:28 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.948 07:02:28 -- accel/accel.sh@16 -- # local accel_opc 00:07:10.948 07:02:28 -- accel/accel.sh@17 -- # local accel_module 00:07:10.948 07:02:28 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.948 07:02:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.948 07:02:28 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.948 07:02:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.948 07:02:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.948 07:02:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.948 07:02:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.948 07:02:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.948 07:02:28 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.948 07:02:28 -- accel/accel.sh@42 -- # jq -r . 00:07:10.948 [2024-12-13 07:02:28.829772] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:10.948 [2024-12-13 07:02:28.829858] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid484800 ] 00:07:10.948 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.948 [2024-12-13 07:02:28.898649] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.948 [2024-12-13 07:02:28.933599] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.884 07:02:30 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:11.884 00:07:11.884 SPDK Configuration: 00:07:11.884 Core mask: 0x1 00:07:11.884 00:07:11.884 Accel Perf Configuration: 00:07:11.884 Workload Type: compress 00:07:11.884 Transfer size: 4096 bytes 00:07:11.884 Vector count 1 00:07:11.884 Module: software 00:07:11.884 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:11.884 Queue depth: 32 00:07:11.884 Allocate depth: 32 00:07:11.884 # threads/core: 1 00:07:11.884 Run time: 1 seconds 00:07:11.884 Verify: No 00:07:11.884 00:07:11.884 Running for 1 seconds... 
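Unlike the DIF cases, the compress workload operates on a real input file: the "Preparing input file..." line and the File Name row above refer to the test corpus at test/accel/bib. A sketch of the equivalent manual run, with paths taken from this log:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK/build/examples/accel_perf" -t 1 -w compress -l "$SPDK/test/accel/bib"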
00:07:11.884 00:07:11.884 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:11.884 ------------------------------------------------------------------------------------ 00:07:11.884 0,0 68096/s 283 MiB/s 0 0 00:07:11.884 ==================================================================================== 00:07:11.884 Total 68096/s 266 MiB/s 0 0' 00:07:11.884 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:11.884 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:11.884 07:02:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:11.885 07:02:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:11.885 07:02:30 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.885 07:02:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:11.885 07:02:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.885 07:02:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.885 07:02:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:11.885 07:02:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:11.885 07:02:30 -- accel/accel.sh@41 -- # local IFS=, 00:07:11.885 07:02:30 -- accel/accel.sh@42 -- # jq -r . 00:07:11.885 [2024-12-13 07:02:30.116255] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:11.885 [2024-12-13 07:02:30.116342] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid484954 ] 00:07:12.143 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.143 [2024-12-13 07:02:30.184765] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.143 [2024-12-13 07:02:30.221112] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val= 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val= 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val= 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val=0x1 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val= 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val= 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val=compress 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 
07:02:30 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val= 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val=software 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@23 -- # accel_module=software 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val=32 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val=32 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val=1 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.143 07:02:30 -- accel/accel.sh@21 -- # val=No 00:07:12.143 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.143 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.144 07:02:30 -- accel/accel.sh@21 -- # val= 00:07:12.144 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.144 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.144 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:12.144 07:02:30 -- accel/accel.sh@21 -- # val= 00:07:12.144 07:02:30 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.144 07:02:30 -- accel/accel.sh@20 -- # IFS=: 00:07:12.144 07:02:30 -- accel/accel.sh@20 -- # read -r var val 00:07:13.521 07:02:31 -- accel/accel.sh@21 -- # val= 00:07:13.521 07:02:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # IFS=: 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # read -r var val 00:07:13.521 07:02:31 -- accel/accel.sh@21 -- # val= 00:07:13.521 07:02:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # IFS=: 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # read -r var val 00:07:13.521 07:02:31 -- accel/accel.sh@21 -- # val= 00:07:13.521 07:02:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # 
IFS=: 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # read -r var val 00:07:13.521 07:02:31 -- accel/accel.sh@21 -- # val= 00:07:13.521 07:02:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # IFS=: 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # read -r var val 00:07:13.521 07:02:31 -- accel/accel.sh@21 -- # val= 00:07:13.521 07:02:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # IFS=: 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # read -r var val 00:07:13.521 07:02:31 -- accel/accel.sh@21 -- # val= 00:07:13.521 07:02:31 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # IFS=: 00:07:13.521 07:02:31 -- accel/accel.sh@20 -- # read -r var val 00:07:13.521 07:02:31 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:13.521 07:02:31 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:13.521 07:02:31 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:13.521 00:07:13.521 real 0m2.575s 00:07:13.521 user 0m2.310s 00:07:13.521 sys 0m0.262s 00:07:13.521 07:02:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:13.521 07:02:31 -- common/autotest_common.sh@10 -- # set +x 00:07:13.521 ************************************ 00:07:13.521 END TEST accel_comp 00:07:13.521 ************************************ 00:07:13.521 07:02:31 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:13.521 07:02:31 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:13.521 07:02:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:13.521 07:02:31 -- common/autotest_common.sh@10 -- # set +x 00:07:13.521 ************************************ 00:07:13.521 START TEST accel_decomp 00:07:13.521 ************************************ 00:07:13.521 07:02:31 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:13.521 07:02:31 -- accel/accel.sh@16 -- # local accel_opc 00:07:13.521 07:02:31 -- accel/accel.sh@17 -- # local accel_module 00:07:13.521 07:02:31 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:13.521 07:02:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:13.521 07:02:31 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.521 07:02:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.521 07:02:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.521 07:02:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.521 07:02:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.521 07:02:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.521 07:02:31 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.521 07:02:31 -- accel/accel.sh@42 -- # jq -r . 00:07:13.521 [2024-12-13 07:02:31.448697] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
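The decompress case adds -y, which enables verification of the output data (hence "Verify: Yes" in the configuration below, versus "Verify: No" for compress). The manual equivalent, again a sketch using the paths from this log:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  "$SPDK/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK/test/accel/bib" -y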
00:07:13.521 [2024-12-13 07:02:31.448782] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485138 ] 00:07:13.521 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.521 [2024-12-13 07:02:31.516999] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.521 [2024-12-13 07:02:31.552821] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.899 07:02:32 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:14.899 00:07:14.899 SPDK Configuration: 00:07:14.899 Core mask: 0x1 00:07:14.899 00:07:14.899 Accel Perf Configuration: 00:07:14.899 Workload Type: decompress 00:07:14.899 Transfer size: 4096 bytes 00:07:14.899 Vector count 1 00:07:14.899 Module: software 00:07:14.899 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:14.899 Queue depth: 32 00:07:14.899 Allocate depth: 32 00:07:14.899 # threads/core: 1 00:07:14.899 Run time: 1 seconds 00:07:14.899 Verify: Yes 00:07:14.899 00:07:14.899 Running for 1 seconds... 00:07:14.899 00:07:14.899 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:14.899 ------------------------------------------------------------------------------------ 00:07:14.899 0,0 94816/s 174 MiB/s 0 0 00:07:14.899 ==================================================================================== 00:07:14.899 Total 94816/s 370 MiB/s 0 0' 00:07:14.899 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.899 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.899 07:02:32 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:14.899 07:02:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:14.899 07:02:32 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.899 07:02:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:14.899 07:02:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.899 07:02:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.900 07:02:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:14.900 07:02:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:14.900 07:02:32 -- accel/accel.sh@41 -- # local IFS=, 00:07:14.900 07:02:32 -- accel/accel.sh@42 -- # jq -r . 00:07:14.900 [2024-12-13 07:02:32.725437] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:14.900 [2024-12-13 07:02:32.725498] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485381 ] 00:07:14.900 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.900 [2024-12-13 07:02:32.787777] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.900 [2024-12-13 07:02:32.821710] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val= 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val= 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val= 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val=0x1 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val= 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val= 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val=decompress 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val= 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val=software 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@23 -- # accel_module=software 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val=32 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 
-- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val=32 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val=1 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val=Yes 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val= 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:14.900 07:02:32 -- accel/accel.sh@21 -- # val= 00:07:14.900 07:02:32 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # IFS=: 00:07:14.900 07:02:32 -- accel/accel.sh@20 -- # read -r var val 00:07:15.837 07:02:33 -- accel/accel.sh@21 -- # val= 00:07:15.837 07:02:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.837 07:02:33 -- accel/accel.sh@21 -- # val= 00:07:15.837 07:02:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.837 07:02:33 -- accel/accel.sh@21 -- # val= 00:07:15.837 07:02:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.837 07:02:33 -- accel/accel.sh@21 -- # val= 00:07:15.837 07:02:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.837 07:02:33 -- accel/accel.sh@21 -- # val= 00:07:15.837 07:02:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.837 07:02:33 -- accel/accel.sh@21 -- # val= 00:07:15.837 07:02:33 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # IFS=: 00:07:15.837 07:02:33 -- accel/accel.sh@20 -- # read -r var val 00:07:15.837 07:02:33 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:15.837 07:02:33 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:15.837 07:02:33 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:15.837 00:07:15.837 real 0m2.556s 00:07:15.837 user 0m2.301s 00:07:15.837 sys 0m0.251s 00:07:15.837 07:02:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:15.837 07:02:33 -- common/autotest_common.sh@10 -- # set +x 00:07:15.837 ************************************ 00:07:15.837 END TEST accel_decomp 00:07:15.837 ************************************ 00:07:15.837 07:02:34 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:15.837 07:02:34 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:15.837 07:02:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:15.837 07:02:34 -- common/autotest_common.sh@10 -- # set +x 00:07:15.837 ************************************ 00:07:15.837 START TEST accel_decmop_full 00:07:15.837 ************************************ 00:07:15.837 07:02:34 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:15.837 07:02:34 -- accel/accel.sh@16 -- # local accel_opc 00:07:15.837 07:02:34 -- accel/accel.sh@17 -- # local accel_module 00:07:15.837 07:02:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:15.837 07:02:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:15.837 07:02:34 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.837 07:02:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:15.837 07:02:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.837 07:02:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.837 07:02:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:15.837 07:02:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:15.837 07:02:34 -- accel/accel.sh@41 -- # local IFS=, 00:07:15.837 07:02:34 -- accel/accel.sh@42 -- # jq -r . 00:07:15.837 [2024-12-13 07:02:34.047958] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:15.837 [2024-12-13 07:02:34.048066] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485662 ] 00:07:16.096 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.096 [2024-12-13 07:02:34.116443] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.096 [2024-12-13 07:02:34.151468] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.473 07:02:35 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:17.473 00:07:17.474 SPDK Configuration: 00:07:17.474 Core mask: 0x1 00:07:17.474 00:07:17.474 Accel Perf Configuration: 00:07:17.474 Workload Type: decompress 00:07:17.474 Transfer size: 111250 bytes 00:07:17.474 Vector count 1 00:07:17.474 Module: software 00:07:17.474 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:17.474 Queue depth: 32 00:07:17.474 Allocate depth: 32 00:07:17.474 # threads/core: 1 00:07:17.474 Run time: 1 seconds 00:07:17.474 Verify: Yes 00:07:17.474 00:07:17.474 Running for 1 seconds... 
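The accel_decmop_full variant passes -o 0 on top of -y. Judging from the configuration above, a transfer size of 0 makes accel_perf process each compressed chunk of the input at its full size (111250 bytes here) instead of the 4096-byte default; that reading is inferred from this log rather than from documentation. The Total row below is consistent with it: 5920 chunks/s × 111250 bytes ≈ 658.6 MB/s ≈ 628 MiB/s.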
00:07:17.474 00:07:17.474 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:17.474 ------------------------------------------------------------------------------------ 00:07:17.474 0,0 5920/s 244 MiB/s 0 0 00:07:17.474 ==================================================================================== 00:07:17.474 Total 5920/s 628 MiB/s 0 0' 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.474 07:02:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:17.474 07:02:35 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.474 07:02:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.474 07:02:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.474 07:02:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.474 07:02:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.474 07:02:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.474 07:02:35 -- accel/accel.sh@41 -- # local IFS=, 00:07:17.474 07:02:35 -- accel/accel.sh@42 -- # jq -r . 00:07:17.474 [2024-12-13 07:02:35.338997] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:17.474 [2024-12-13 07:02:35.339100] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485931 ] 00:07:17.474 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.474 [2024-12-13 07:02:35.406978] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.474 [2024-12-13 07:02:35.440855] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val= 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val= 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val= 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val=0x1 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val= 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val= 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val=decompress 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val= 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val=software 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@23 -- # accel_module=software 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val=32 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val=32 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val=1 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val=Yes 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val= 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:17.474 07:02:35 -- accel/accel.sh@21 -- # val= 00:07:17.474 07:02:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # IFS=: 00:07:17.474 07:02:35 -- accel/accel.sh@20 -- # read -r var val 00:07:18.410 07:02:36 -- accel/accel.sh@21 -- # val= 00:07:18.410 07:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.410 07:02:36 -- accel/accel.sh@21 -- # val= 00:07:18.410 07:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.410 07:02:36 -- accel/accel.sh@21 -- # val= 00:07:18.410 07:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.410 07:02:36 
-- accel/accel.sh@20 -- # IFS=: 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.410 07:02:36 -- accel/accel.sh@21 -- # val= 00:07:18.410 07:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.410 07:02:36 -- accel/accel.sh@21 -- # val= 00:07:18.410 07:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.410 07:02:36 -- accel/accel.sh@21 -- # val= 00:07:18.410 07:02:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # IFS=: 00:07:18.410 07:02:36 -- accel/accel.sh@20 -- # read -r var val 00:07:18.410 07:02:36 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:18.410 07:02:36 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:18.410 07:02:36 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:18.410 00:07:18.410 real 0m2.585s 00:07:18.410 user 0m2.321s 00:07:18.410 sys 0m0.258s 00:07:18.410 07:02:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:18.410 07:02:36 -- common/autotest_common.sh@10 -- # set +x 00:07:18.410 ************************************ 00:07:18.410 END TEST accel_decmop_full 00:07:18.410 ************************************ 00:07:18.411 07:02:36 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:18.411 07:02:36 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:18.411 07:02:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:18.411 07:02:36 -- common/autotest_common.sh@10 -- # set +x 00:07:18.669 ************************************ 00:07:18.669 START TEST accel_decomp_mcore 00:07:18.669 ************************************ 00:07:18.669 07:02:36 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:18.669 07:02:36 -- accel/accel.sh@16 -- # local accel_opc 00:07:18.669 07:02:36 -- accel/accel.sh@17 -- # local accel_module 00:07:18.669 07:02:36 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:18.669 07:02:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:18.669 07:02:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:18.669 07:02:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:18.669 07:02:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.669 07:02:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.669 07:02:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:18.669 07:02:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:18.669 07:02:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:18.669 07:02:36 -- accel/accel.sh@42 -- # jq -r . 00:07:18.669 [2024-12-13 07:02:36.674473] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
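[Editor's note] The long runs of "-- # IFS=:", "-- # read -r var val" and "-- # case "$var" in" above are bash xtrace from accel.sh walking the settings it passes to accel_perf one key/value pair at a time. A minimal sketch of that loop, with its shape inferred from the trace rather than copied from the script (accel_perf_output is an assumed variable holding the reported settings):

    # Shape inferred from the xtrace: split each "Key: value" line on ':'
    # and record the workload type and the module that served it.
    while IFS=: read -r var val; do
        case "$var" in
            *"Workload Type"*) accel_opc=${val##* } ;;    # e.g. decompress
            *Module*)          accel_module=${val##* } ;; # e.g. software
        esac
    done <<< "$accel_perf_output"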
00:07:18.669 [2024-12-13 07:02:36.674573] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid486217 ] 00:07:18.669 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.669 [2024-12-13 07:02:36.742091] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:18.669 [2024-12-13 07:02:36.779657] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:18.669 [2024-12-13 07:02:36.779756] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:18.669 [2024-12-13 07:02:36.779844] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:18.669 [2024-12-13 07:02:36.779846] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.045 07:02:37 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:20.045 00:07:20.045 SPDK Configuration: 00:07:20.045 Core mask: 0xf 00:07:20.045 00:07:20.045 Accel Perf Configuration: 00:07:20.045 Workload Type: decompress 00:07:20.045 Transfer size: 4096 bytes 00:07:20.045 Vector count 1 00:07:20.045 Module: software 00:07:20.045 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:20.045 Queue depth: 32 00:07:20.045 Allocate depth: 32 00:07:20.045 # threads/core: 1 00:07:20.045 Run time: 1 seconds 00:07:20.045 Verify: Yes 00:07:20.045 00:07:20.045 Running for 1 seconds... 00:07:20.045 00:07:20.045 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:20.045 ------------------------------------------------------------------------------------ 00:07:20.045 0,0 77984/s 143 MiB/s 0 0 00:07:20.045 3,0 78368/s 144 MiB/s 0 0 00:07:20.045 2,0 78112/s 143 MiB/s 0 0 00:07:20.045 1,0 78272/s 144 MiB/s 0 0 00:07:20.045 ==================================================================================== 00:07:20.045 Total 312736/s 1221 MiB/s 0 0' 00:07:20.045 07:02:37 -- accel/accel.sh@20 -- # IFS=: 00:07:20.045 07:02:37 -- accel/accel.sh@20 -- # read -r var val 00:07:20.045 07:02:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.045 07:02:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:20.045 07:02:37 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.045 07:02:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:20.045 07:02:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.045 07:02:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.045 07:02:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:20.045 07:02:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:20.045 07:02:37 -- accel/accel.sh@41 -- # local IFS=, 00:07:20.045 07:02:37 -- accel/accel.sh@42 -- # jq -r . 00:07:20.045 [2024-12-13 07:02:37.972575] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
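[Editor's note] Each invocation above runs accel_perf with -c /dev/fd/62: build_accel_config assembles a JSON accel configuration (the accel_json_cfg array, piped through jq -r .) and bash process substitution hands it to the app as a file descriptor. A sketch of that wiring, assuming the JSON shape from the empty accel_json_cfg=() trace; the exact document the real helper emits may differ:

    # Assumed wiring behind "-c /dev/fd/62" in the traced command line:
    build_accel_config() {
        local IFS=,
        local accel_json_cfg=()   # empty here: no hardware modules enabled
        jq -r . <<< "{\"subsystems\": [{\"subsystem\": \"accel\", \"config\": [${accel_json_cfg[*]}]}]}"
    }
    # <(...) appears to the child process as /dev/fd/62 (the fd number can vary)
    "$SPDK_EXAMPLE_DIR/accel_perf" -c <(build_accel_config) \
        -t 1 -w decompress -l "$testdir/bib" -y -m 0xf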
00:07:20.045 [2024-12-13 07:02:37.972663] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid486448 ] 00:07:20.045 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.046 [2024-12-13 07:02:38.040509] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:20.046 [2024-12-13 07:02:38.076948] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:20.046 [2024-12-13 07:02:38.077044] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:20.046 [2024-12-13 07:02:38.077105] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:20.046 [2024-12-13 07:02:38.077107] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val= 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val= 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val= 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val=0xf 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val= 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val= 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val=decompress 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val= 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val=software 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@23 -- # accel_module=software 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val=32 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val=32 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val=1 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val=Yes 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val= 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:20.046 07:02:38 -- accel/accel.sh@21 -- # val= 00:07:20.046 07:02:38 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # IFS=: 00:07:20.046 07:02:38 -- accel/accel.sh@20 -- # read -r var val 00:07:21.423 07:02:39 -- accel/accel.sh@21 -- # val= 00:07:21.423 07:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.423 07:02:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.423 07:02:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.423 07:02:39 -- accel/accel.sh@21 -- # val= 00:07:21.423 07:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.423 07:02:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.423 07:02:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.423 07:02:39 -- accel/accel.sh@21 -- # val= 00:07:21.423 07:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.423 07:02:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.423 07:02:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.423 07:02:39 -- accel/accel.sh@21 -- # val= 00:07:21.423 07:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.423 07:02:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.424 07:02:39 -- accel/accel.sh@21 -- # val= 00:07:21.424 07:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.424 07:02:39 -- accel/accel.sh@21 -- # val= 00:07:21.424 07:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.424 07:02:39 -- accel/accel.sh@21 -- # val= 00:07:21.424 07:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.424 07:02:39 -- accel/accel.sh@21 -- # val= 00:07:21.424 07:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.424 
07:02:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.424 07:02:39 -- accel/accel.sh@21 -- # val= 00:07:21.424 07:02:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # IFS=: 00:07:21.424 07:02:39 -- accel/accel.sh@20 -- # read -r var val 00:07:21.424 07:02:39 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:21.424 07:02:39 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:21.424 07:02:39 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:21.424 00:07:21.424 real 0m2.603s 00:07:21.424 user 0m9.009s 00:07:21.424 sys 0m0.263s 00:07:21.424 07:02:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:21.424 07:02:39 -- common/autotest_common.sh@10 -- # set +x 00:07:21.424 ************************************ 00:07:21.424 END TEST accel_decomp_mcore 00:07:21.424 ************************************ 00:07:21.424 07:02:39 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:21.424 07:02:39 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:21.424 07:02:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:21.424 07:02:39 -- common/autotest_common.sh@10 -- # set +x 00:07:21.424 ************************************ 00:07:21.424 START TEST accel_decomp_full_mcore 00:07:21.424 ************************************ 00:07:21.424 07:02:39 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:21.424 07:02:39 -- accel/accel.sh@16 -- # local accel_opc 00:07:21.424 07:02:39 -- accel/accel.sh@17 -- # local accel_module 00:07:21.424 07:02:39 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:21.424 07:02:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:21.424 07:02:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:21.424 07:02:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:21.424 07:02:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:21.424 07:02:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:21.424 07:02:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:21.424 07:02:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:21.424 07:02:39 -- accel/accel.sh@41 -- # local IFS=, 00:07:21.424 07:02:39 -- accel/accel.sh@42 -- # jq -r . 00:07:21.424 [2024-12-13 07:02:39.327622] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
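[Editor's note] The starred START TEST / END TEST banners and the real/user/sys triplet that closes each case come from the run_test helper traced above (autotest_common.sh), which times the test body and propagates its exit status so a failed case fails the build. A rough reconstruction from the log output, not a copy of the script:

    # Rough shape of run_test, matching the banners and timing in this log:
    run_test() {
        local test_name=$1; shift
        echo "************************************"
        echo "START TEST $test_name"
        echo "************************************"
        time "$@"       # emits the real/user/sys lines seen above
        local rc=$?
        echo "************************************"
        echo "END TEST $test_name"
        echo "************************************"
        return $rc
    }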
00:07:21.424 [2024-12-13 07:02:39.327712] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid486639 ] 00:07:21.424 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.424 [2024-12-13 07:02:39.397260] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:21.424 [2024-12-13 07:02:39.435297] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:21.424 [2024-12-13 07:02:39.435395] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:21.424 [2024-12-13 07:02:39.435476] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:21.424 [2024-12-13 07:02:39.435478] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.802 07:02:40 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:22.802 00:07:22.802 SPDK Configuration: 00:07:22.802 Core mask: 0xf 00:07:22.802 00:07:22.802 Accel Perf Configuration: 00:07:22.802 Workload Type: decompress 00:07:22.802 Transfer size: 111250 bytes 00:07:22.802 Vector count 1 00:07:22.802 Module: software 00:07:22.802 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:22.802 Queue depth: 32 00:07:22.802 Allocate depth: 32 00:07:22.802 # threads/core: 1 00:07:22.802 Run time: 1 seconds 00:07:22.802 Verify: Yes 00:07:22.802 00:07:22.802 Running for 1 seconds... 00:07:22.802 00:07:22.802 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:22.802 ------------------------------------------------------------------------------------ 00:07:22.802 0,0 5792/s 239 MiB/s 0 0 00:07:22.802 3,0 5824/s 240 MiB/s 0 0 00:07:22.802 2,0 5824/s 240 MiB/s 0 0 00:07:22.802 1,0 5824/s 240 MiB/s 0 0 00:07:22.802 ==================================================================================== 00:07:22.802 Total 23264/s 2468 MiB/s 0 0' 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.802 07:02:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.802 07:02:40 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.802 07:02:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:22.802 07:02:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.802 07:02:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.802 07:02:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:22.802 07:02:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:22.802 07:02:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:22.802 07:02:40 -- accel/accel.sh@42 -- # jq -r . 00:07:22.802 [2024-12-13 07:02:40.633070] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
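[Editor's note] The -m 0xf argument selects cores 0-3, which is why spdk_app_start reports four available cores, four reactors come up, and the results table has four Core,Thread rows. As a small illustrative snippet (not part of the test scripts) for sanity-checking such a mask:

    # Count the CPU cores selected by a hex core mask such as 0xf:
    mask=0xf
    count=0
    for ((bit = mask; bit > 0; bit >>= 1)); do
        ((count += bit & 1))
    done
    echo "$mask selects $count cores"   # -> 0xf selects 4 cores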
00:07:22.802 [2024-12-13 07:02:40.633157] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid486811 ] 00:07:22.802 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.802 [2024-12-13 07:02:40.702643] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:22.802 [2024-12-13 07:02:40.739736] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.802 [2024-12-13 07:02:40.739831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:22.802 [2024-12-13 07:02:40.739894] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:22.802 [2024-12-13 07:02:40.739896] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val= 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val= 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val= 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val=0xf 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val= 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val= 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val=decompress 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val= 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val=software 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@23 -- # accel_module=software 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val=32 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val=32 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val=1 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val=Yes 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val= 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:22.802 07:02:40 -- accel/accel.sh@21 -- # val= 00:07:22.802 07:02:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # IFS=: 00:07:22.802 07:02:40 -- accel/accel.sh@20 -- # read -r var val 00:07:23.738 07:02:41 -- accel/accel.sh@21 -- # val= 00:07:23.738 07:02:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.738 07:02:41 -- accel/accel.sh@20 -- # IFS=: 00:07:23.738 07:02:41 -- accel/accel.sh@20 -- # read -r var val 00:07:23.738 07:02:41 -- accel/accel.sh@21 -- # val= 00:07:23.739 07:02:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # IFS=: 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # read -r var val 00:07:23.739 07:02:41 -- accel/accel.sh@21 -- # val= 00:07:23.739 07:02:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # IFS=: 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # read -r var val 00:07:23.739 07:02:41 -- accel/accel.sh@21 -- # val= 00:07:23.739 07:02:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # IFS=: 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # read -r var val 00:07:23.739 07:02:41 -- accel/accel.sh@21 -- # val= 00:07:23.739 07:02:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # IFS=: 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # read -r var val 00:07:23.739 07:02:41 -- accel/accel.sh@21 -- # val= 00:07:23.739 07:02:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # IFS=: 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # read -r var val 00:07:23.739 07:02:41 -- accel/accel.sh@21 -- # val= 00:07:23.739 07:02:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # IFS=: 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # read -r var val 00:07:23.739 07:02:41 -- accel/accel.sh@21 -- # val= 00:07:23.739 07:02:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.739 
07:02:41 -- accel/accel.sh@20 -- # IFS=: 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # read -r var val 00:07:23.739 07:02:41 -- accel/accel.sh@21 -- # val= 00:07:23.739 07:02:41 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # IFS=: 00:07:23.739 07:02:41 -- accel/accel.sh@20 -- # read -r var val 00:07:23.739 07:02:41 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:23.739 07:02:41 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:23.739 07:02:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:23.739 00:07:23.739 real 0m2.620s 00:07:23.739 user 0m9.047s 00:07:23.739 sys 0m0.273s 00:07:23.739 07:02:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:23.739 07:02:41 -- common/autotest_common.sh@10 -- # set +x 00:07:23.739 ************************************ 00:07:23.739 END TEST accel_decomp_full_mcore 00:07:23.739 ************************************ 00:07:23.739 07:02:41 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.739 07:02:41 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:23.739 07:02:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:23.739 07:02:41 -- common/autotest_common.sh@10 -- # set +x 00:07:23.739 ************************************ 00:07:23.739 START TEST accel_decomp_mthread 00:07:23.739 ************************************ 00:07:23.739 07:02:41 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.739 07:02:41 -- accel/accel.sh@16 -- # local accel_opc 00:07:23.739 07:02:41 -- accel/accel.sh@17 -- # local accel_module 00:07:23.998 07:02:41 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.998 07:02:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.998 07:02:41 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.998 07:02:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:23.998 07:02:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.998 07:02:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.998 07:02:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:23.998 07:02:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:23.998 07:02:41 -- accel/accel.sh@41 -- # local IFS=, 00:07:23.998 07:02:41 -- accel/accel.sh@42 -- # jq -r . 00:07:23.998 [2024-12-13 07:02:41.997253] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:23.998 [2024-12-13 07:02:41.997344] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid487091 ] 00:07:23.998 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.998 [2024-12-13 07:02:42.066105] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.998 [2024-12-13 07:02:42.101564] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.376 07:02:43 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:25.376 00:07:25.376 SPDK Configuration: 00:07:25.376 Core mask: 0x1 00:07:25.376 00:07:25.376 Accel Perf Configuration: 00:07:25.376 Workload Type: decompress 00:07:25.376 Transfer size: 4096 bytes 00:07:25.376 Vector count 1 00:07:25.376 Module: software 00:07:25.376 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:25.376 Queue depth: 32 00:07:25.376 Allocate depth: 32 00:07:25.376 # threads/core: 2 00:07:25.376 Run time: 1 seconds 00:07:25.376 Verify: Yes 00:07:25.376 00:07:25.376 Running for 1 seconds... 00:07:25.376 00:07:25.376 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:25.376 ------------------------------------------------------------------------------------ 00:07:25.376 0,1 47808/s 88 MiB/s 0 0 00:07:25.376 0,0 47648/s 87 MiB/s 0 0 00:07:25.376 ==================================================================================== 00:07:25.376 Total 95456/s 372 MiB/s 0 0' 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.376 07:02:43 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.376 07:02:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:25.376 07:02:43 -- accel/accel.sh@12 -- # build_accel_config 00:07:25.376 07:02:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:25.376 07:02:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.376 07:02:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.376 07:02:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:25.376 07:02:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:25.376 07:02:43 -- accel/accel.sh@41 -- # local IFS=, 00:07:25.376 07:02:43 -- accel/accel.sh@42 -- # jq -r . 00:07:25.376 [2024-12-13 07:02:43.287524] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
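[Editor's note] In the table above, -T 2 placed two threads on core 0 (rows 0,1 and 0,0), and the Total row is simply the sum of the per-thread rates: 47808 + 47648 = 95456 transfers/s, just as the 0xf run earlier summed to 77984 + 78368 + 78112 + 78272 = 312736. An illustrative check over a saved copy of such a table (perf_table.txt is a hypothetical file name):

    # Sum the per-thread transfer rates and print them next to the Total row:
    awk -F'[ /]+' '/^[0-9]+,[0-9]+/ { sum += $2 } /^Total/ { print sum, $2 }' perf_table.txt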
00:07:25.376 [2024-12-13 07:02:43.287603] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid487357 ] 00:07:25.376 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.376 [2024-12-13 07:02:43.354540] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.376 [2024-12-13 07:02:43.388137] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.376 07:02:43 -- accel/accel.sh@21 -- # val= 00:07:25.376 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.376 07:02:43 -- accel/accel.sh@21 -- # val= 00:07:25.376 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.376 07:02:43 -- accel/accel.sh@21 -- # val= 00:07:25.376 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.376 07:02:43 -- accel/accel.sh@21 -- # val=0x1 00:07:25.376 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.376 07:02:43 -- accel/accel.sh@21 -- # val= 00:07:25.376 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.376 07:02:43 -- accel/accel.sh@21 -- # val= 00:07:25.376 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.376 07:02:43 -- accel/accel.sh@21 -- # val=decompress 00:07:25.376 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.376 07:02:43 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.376 07:02:43 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:25.376 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.376 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.376 07:02:43 -- accel/accel.sh@21 -- # val= 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.377 07:02:43 -- accel/accel.sh@21 -- # val=software 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@23 -- # accel_module=software 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.377 07:02:43 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.377 07:02:43 -- accel/accel.sh@21 -- # val=32 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 
-- accel/accel.sh@20 -- # read -r var val 00:07:25.377 07:02:43 -- accel/accel.sh@21 -- # val=32 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.377 07:02:43 -- accel/accel.sh@21 -- # val=2 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.377 07:02:43 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.377 07:02:43 -- accel/accel.sh@21 -- # val=Yes 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.377 07:02:43 -- accel/accel.sh@21 -- # val= 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:25.377 07:02:43 -- accel/accel.sh@21 -- # val= 00:07:25.377 07:02:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # IFS=: 00:07:25.377 07:02:43 -- accel/accel.sh@20 -- # read -r var val 00:07:26.314 07:02:44 -- accel/accel.sh@21 -- # val= 00:07:26.314 07:02:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.314 07:02:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.314 07:02:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.314 07:02:44 -- accel/accel.sh@21 -- # val= 00:07:26.573 07:02:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.573 07:02:44 -- accel/accel.sh@21 -- # val= 00:07:26.573 07:02:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.573 07:02:44 -- accel/accel.sh@21 -- # val= 00:07:26.573 07:02:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.573 07:02:44 -- accel/accel.sh@21 -- # val= 00:07:26.573 07:02:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.573 07:02:44 -- accel/accel.sh@21 -- # val= 00:07:26.573 07:02:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.573 07:02:44 -- accel/accel.sh@21 -- # val= 00:07:26.573 07:02:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # IFS=: 00:07:26.573 07:02:44 -- accel/accel.sh@20 -- # read -r var val 00:07:26.573 07:02:44 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:26.573 07:02:44 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:26.573 07:02:44 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:26.573 00:07:26.573 real 0m2.585s 00:07:26.573 user 0m2.338s 00:07:26.573 sys 0m0.257s 00:07:26.573 07:02:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:26.573 07:02:44 -- common/autotest_common.sh@10 -- # set +x 
00:07:26.573 ************************************ 00:07:26.573 END TEST accel_decomp_mthread 00:07:26.573 ************************************ 00:07:26.573 07:02:44 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:26.573 07:02:44 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:26.573 07:02:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:26.573 07:02:44 -- common/autotest_common.sh@10 -- # set +x 00:07:26.573 ************************************ 00:07:26.573 START TEST accel_deomp_full_mthread 00:07:26.573 ************************************ 00:07:26.573 07:02:44 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:26.573 07:02:44 -- accel/accel.sh@16 -- # local accel_opc 00:07:26.573 07:02:44 -- accel/accel.sh@17 -- # local accel_module 00:07:26.573 07:02:44 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:26.573 07:02:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:26.573 07:02:44 -- accel/accel.sh@12 -- # build_accel_config 00:07:26.573 07:02:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:26.573 07:02:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.573 07:02:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.573 07:02:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:26.573 07:02:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:26.573 07:02:44 -- accel/accel.sh@41 -- # local IFS=, 00:07:26.573 07:02:44 -- accel/accel.sh@42 -- # jq -r . 00:07:26.573 [2024-12-13 07:02:44.629774] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:26.573 [2024-12-13 07:02:44.629860] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid487645 ] 00:07:26.573 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.573 [2024-12-13 07:02:44.697159] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.573 [2024-12-13 07:02:44.732541] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.951 07:02:45 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:27.951 00:07:27.951 SPDK Configuration: 00:07:27.951 Core mask: 0x1 00:07:27.951 00:07:27.951 Accel Perf Configuration: 00:07:27.951 Workload Type: decompress 00:07:27.951 Transfer size: 111250 bytes 00:07:27.951 Vector count 1 00:07:27.951 Module: software 00:07:27.951 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:27.951 Queue depth: 32 00:07:27.951 Allocate depth: 32 00:07:27.951 # threads/core: 2 00:07:27.951 Run time: 1 seconds 00:07:27.951 Verify: Yes 00:07:27.951 00:07:27.951 Running for 1 seconds... 
00:07:27.951 00:07:27.951 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:27.951 ------------------------------------------------------------------------------------ 00:07:27.951 0,1 2976/s 122 MiB/s 0 0 00:07:27.951 0,0 2944/s 121 MiB/s 0 0 00:07:27.951 ==================================================================================== 00:07:27.951 Total 5920/s 628 MiB/s 0 0' 00:07:27.952 07:02:45 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:45 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:27.952 07:02:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:27.952 07:02:45 -- accel/accel.sh@12 -- # build_accel_config 00:07:27.952 07:02:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:27.952 07:02:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:27.952 07:02:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:27.952 07:02:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:27.952 07:02:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:27.952 07:02:45 -- accel/accel.sh@41 -- # local IFS=, 00:07:27.952 07:02:45 -- accel/accel.sh@42 -- # jq -r . 00:07:27.952 [2024-12-13 07:02:45.938432] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:27.952 [2024-12-13 07:02:45.938515] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid487912 ] 00:07:27.952 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.952 [2024-12-13 07:02:46.006417] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.952 [2024-12-13 07:02:46.040082] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val= 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val= 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val= 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val=0x1 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val= 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val= 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val=decompress 
00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val= 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val=software 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@23 -- # accel_module=software 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val=32 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val=32 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val=2 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val=Yes 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val= 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:27.952 07:02:46 -- accel/accel.sh@21 -- # val= 00:07:27.952 07:02:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # IFS=: 00:07:27.952 07:02:46 -- accel/accel.sh@20 -- # read -r var val 00:07:29.329 07:02:47 -- accel/accel.sh@21 -- # val= 00:07:29.329 07:02:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.329 07:02:47 -- accel/accel.sh@20 -- # IFS=: 00:07:29.329 07:02:47 -- accel/accel.sh@20 -- # read -r var val 00:07:29.329 07:02:47 -- accel/accel.sh@21 -- # val= 00:07:29.329 07:02:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # IFS=: 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # read -r var val 00:07:29.330 07:02:47 -- accel/accel.sh@21 -- # val= 00:07:29.330 07:02:47 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # IFS=: 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # read -r var val 00:07:29.330 07:02:47 -- accel/accel.sh@21 -- # val= 00:07:29.330 07:02:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # IFS=: 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # read -r var val 00:07:29.330 07:02:47 -- accel/accel.sh@21 -- # val= 00:07:29.330 07:02:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # IFS=: 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # read -r var val 00:07:29.330 07:02:47 -- accel/accel.sh@21 -- # val= 00:07:29.330 07:02:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # IFS=: 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # read -r var val 00:07:29.330 07:02:47 -- accel/accel.sh@21 -- # val= 00:07:29.330 07:02:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # IFS=: 00:07:29.330 07:02:47 -- accel/accel.sh@20 -- # read -r var val 00:07:29.330 07:02:47 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:29.330 07:02:47 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:29.330 07:02:47 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:29.330 00:07:29.330 real 0m2.622s 00:07:29.330 user 0m2.381s 00:07:29.330 sys 0m0.249s 00:07:29.330 07:02:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:29.330 07:02:47 -- common/autotest_common.sh@10 -- # set +x 00:07:29.330 ************************************ 00:07:29.330 END TEST accel_deomp_full_mthread 00:07:29.330 ************************************ 00:07:29.330 07:02:47 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:29.330 07:02:47 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:29.330 07:02:47 -- accel/accel.sh@129 -- # build_accel_config 00:07:29.330 07:02:47 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:29.330 07:02:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:29.330 07:02:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:29.330 07:02:47 -- common/autotest_common.sh@10 -- # set +x 00:07:29.330 07:02:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:29.330 07:02:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:29.330 07:02:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:29.330 07:02:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:29.330 07:02:47 -- accel/accel.sh@41 -- # local IFS=, 00:07:29.330 07:02:47 -- accel/accel.sh@42 -- # jq -r . 00:07:29.330 ************************************ 00:07:29.330 START TEST accel_dif_functional_tests 00:07:29.330 ************************************ 00:07:29.330 07:02:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:29.330 [2024-12-13 07:02:47.305059] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
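[Editor's note] Unlike the throughput cases, accel_dif_functional_tests runs a standalone CUnit binary (test/accel/dif/dif) and run_test simply relies on its exit status, so any failed assertion fails the stage. A minimal sketch of an equivalent standalone invocation, assuming the $rootdir path and the build_accel_config helper seen earlier in the trace:

    # Minimal sketch: run the CUnit DIF suite and act on its exit status
    # (nonzero if any "Test: ..." assertion had failed):
    if ! "$rootdir/test/accel/dif/dif" -c <(build_accel_config); then
        echo "accel_dif_functional_tests failed"
        exit 1
    fi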
00:07:29.330 [2024-12-13 07:02:47.305141] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid488182 ] 00:07:29.330 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.330 [2024-12-13 07:02:47.371956] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:29.330 [2024-12-13 07:02:47.409171] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:29.330 [2024-12-13 07:02:47.409268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:29.330 [2024-12-13 07:02:47.409268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.330 00:07:29.330 00:07:29.330 CUnit - A unit testing framework for C - Version 2.1-3 00:07:29.330 http://cunit.sourceforge.net/ 00:07:29.330 00:07:29.330 00:07:29.330 Suite: accel_dif 00:07:29.330 Test: verify: DIF generated, GUARD check ...passed 00:07:29.330 Test: verify: DIF generated, APPTAG check ...passed 00:07:29.330 Test: verify: DIF generated, REFTAG check ...passed 00:07:29.330 Test: verify: DIF not generated, GUARD check ...[2024-12-13 07:02:47.472383] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:29.330 [2024-12-13 07:02:47.472432] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:29.330 passed 00:07:29.330 Test: verify: DIF not generated, APPTAG check ...[2024-12-13 07:02:47.472483] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:29.330 [2024-12-13 07:02:47.472502] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:29.330 passed 00:07:29.330 Test: verify: DIF not generated, REFTAG check ...[2024-12-13 07:02:47.472522] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:29.330 [2024-12-13 07:02:47.472540] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:29.330 passed 00:07:29.330 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:29.330 Test: verify: APPTAG incorrect, APPTAG check ...[2024-12-13 07:02:47.472586] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:29.330 passed 00:07:29.330 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:29.330 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:29.330 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:29.330 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-12-13 07:02:47.472692] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:29.330 passed 00:07:29.330 Test: generate copy: DIF generated, GUARD check ...passed 00:07:29.330 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:29.330 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:29.330 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:29.330 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:29.330 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:29.330 Test: generate copy: iovecs-len validate ...[2024-12-13 07:02:47.472878] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:29.330 passed 00:07:29.330 Test: generate copy: buffer alignment validate ...passed 00:07:29.330 00:07:29.330 Run Summary: Type Total Ran Passed Failed Inactive 00:07:29.330 suites 1 1 n/a 0 0 00:07:29.330 tests 20 20 20 0 0 00:07:29.330 asserts 204 204 204 0 n/a 00:07:29.330 00:07:29.330 Elapsed time = 0.000 seconds 00:07:29.589 00:07:29.589 real 0m0.342s 00:07:29.589 user 0m0.531s 00:07:29.589 sys 0m0.154s 00:07:29.589 07:02:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:29.589 07:02:47 -- common/autotest_common.sh@10 -- # set +x 00:07:29.589 ************************************ 00:07:29.589 END TEST accel_dif_functional_tests 00:07:29.589 ************************************ 00:07:29.589 00:07:29.589 real 0m54.892s 00:07:29.589 user 1m2.470s 00:07:29.589 sys 0m6.902s 00:07:29.589 07:02:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:29.589 07:02:47 -- common/autotest_common.sh@10 -- # set +x 00:07:29.589 ************************************ 00:07:29.589 END TEST accel 00:07:29.589 ************************************ 00:07:29.589 07:02:47 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:29.589 07:02:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:29.589 07:02:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:29.589 07:02:47 -- common/autotest_common.sh@10 -- # set +x 00:07:29.589 ************************************ 00:07:29.589 START TEST accel_rpc 00:07:29.589 ************************************ 00:07:29.589 07:02:47 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:29.589 * Looking for test storage... 00:07:29.589 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:29.590 07:02:47 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:29.590 07:02:47 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:29.590 07:02:47 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:29.849 07:02:47 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:29.849 07:02:47 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:29.849 07:02:47 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:29.849 07:02:47 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:29.849 07:02:47 -- scripts/common.sh@335 -- # IFS=.-: 00:07:29.849 07:02:47 -- scripts/common.sh@335 -- # read -ra ver1 00:07:29.849 07:02:47 -- scripts/common.sh@336 -- # IFS=.-: 00:07:29.849 07:02:47 -- scripts/common.sh@336 -- # read -ra ver2 00:07:29.849 07:02:47 -- scripts/common.sh@337 -- # local 'op=<' 00:07:29.849 07:02:47 -- scripts/common.sh@339 -- # ver1_l=2 00:07:29.849 07:02:47 -- scripts/common.sh@340 -- # ver2_l=1 00:07:29.849 07:02:47 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:29.849 07:02:47 -- scripts/common.sh@343 -- # case "$op" in 00:07:29.849 07:02:47 -- scripts/common.sh@344 -- # : 1 00:07:29.849 07:02:47 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:29.849 07:02:47 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:29.849 07:02:47 -- scripts/common.sh@364 -- # decimal 1 00:07:29.849 07:02:47 -- scripts/common.sh@352 -- # local d=1 00:07:29.849 07:02:47 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:29.849 07:02:47 -- scripts/common.sh@354 -- # echo 1 00:07:29.849 07:02:47 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:29.849 07:02:47 -- scripts/common.sh@365 -- # decimal 2 00:07:29.849 07:02:47 -- scripts/common.sh@352 -- # local d=2 00:07:29.849 07:02:47 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:29.849 07:02:47 -- scripts/common.sh@354 -- # echo 2 00:07:29.849 07:02:47 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:29.849 07:02:47 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:29.849 07:02:47 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:29.849 07:02:47 -- scripts/common.sh@367 -- # return 0 00:07:29.849 07:02:47 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:29.849 07:02:47 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:29.849 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.849 --rc genhtml_branch_coverage=1 00:07:29.849 --rc genhtml_function_coverage=1 00:07:29.849 --rc genhtml_legend=1 00:07:29.849 --rc geninfo_all_blocks=1 00:07:29.849 --rc geninfo_unexecuted_blocks=1 00:07:29.849 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.849 ' 00:07:29.849 07:02:47 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:29.849 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.849 --rc genhtml_branch_coverage=1 00:07:29.849 --rc genhtml_function_coverage=1 00:07:29.849 --rc genhtml_legend=1 00:07:29.849 --rc geninfo_all_blocks=1 00:07:29.849 --rc geninfo_unexecuted_blocks=1 00:07:29.849 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.849 ' 00:07:29.849 07:02:47 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:29.849 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.849 --rc genhtml_branch_coverage=1 00:07:29.849 --rc genhtml_function_coverage=1 00:07:29.849 --rc genhtml_legend=1 00:07:29.849 --rc geninfo_all_blocks=1 00:07:29.849 --rc geninfo_unexecuted_blocks=1 00:07:29.849 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.849 ' 00:07:29.849 07:02:47 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:29.849 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.849 --rc genhtml_branch_coverage=1 00:07:29.849 --rc genhtml_function_coverage=1 00:07:29.849 --rc genhtml_legend=1 00:07:29.849 --rc geninfo_all_blocks=1 00:07:29.849 --rc geninfo_unexecuted_blocks=1 00:07:29.849 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.849 ' 00:07:29.849 07:02:47 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:29.849 07:02:47 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=488277 00:07:29.849 07:02:47 -- accel/accel_rpc.sh@15 -- # waitforlisten 488277 00:07:29.849 07:02:47 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:29.849 07:02:47 -- common/autotest_common.sh@829 -- # '[' -z 488277 ']' 00:07:29.849 07:02:47 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:29.849 07:02:47 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:29.849 
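The shell trace above (repeated before every sub-test) is scripts/common.sh deciding which lcov options to pass: `lt 1.15 2` splits both version strings on `.` and `-` and compares them component by component as integers. A minimal Python rendering of the same comparison, with zero-padding standing in for bash's unset-array-element behavior:

```python
import re

def cmp_versions(v1: str, op: str, v2: str) -> bool:
    # Split on '.' or '-' as IFS=.- does; dropping non-numeric parts is a
    # simplification of common.sh's decimal() helper.
    a = [int(x) for x in re.split(r"[.-]", v1) if x.isdigit()]
    b = [int(x) for x in re.split(r"[.-]", v2) if x.isdigit()]
    n = max(len(a), len(b))
    a += [0] * (n - len(a))   # unset bash array elements compare as 0
    b += [0] * (n - len(b))
    return {"<": a < b, ">": a > b, "==": a == b}[op]

assert cmp_versions("1.15", "<", "2")   # lcov 1.15 < 2, so the
                                        # --rc lcov_* option set is chosen
```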
07:02:47 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:29.849 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:29.849 07:02:47 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:29.849 07:02:47 -- common/autotest_common.sh@10 -- # set +x 00:07:29.849 [2024-12-13 07:02:47.921425] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:29.849 [2024-12-13 07:02:47.921513] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid488277 ] 00:07:29.849 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.849 [2024-12-13 07:02:47.989902] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.849 [2024-12-13 07:02:48.025831] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:29.849 [2024-12-13 07:02:48.025946] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.849 07:02:48 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:29.849 07:02:48 -- common/autotest_common.sh@862 -- # return 0 00:07:29.849 07:02:48 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:29.849 07:02:48 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:29.849 07:02:48 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:29.849 07:02:48 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:29.849 07:02:48 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:29.849 07:02:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:29.849 07:02:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:29.849 07:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:29.849 ************************************ 00:07:29.849 START TEST accel_assign_opcode 00:07:29.849 ************************************ 00:07:30.109 07:02:48 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:07:30.109 07:02:48 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:30.109 07:02:48 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:30.109 07:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.109 [2024-12-13 07:02:48.094425] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:30.109 07:02:48 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:30.109 07:02:48 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:30.109 07:02:48 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:30.109 07:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.109 [2024-12-13 07:02:48.106456] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:30.109 07:02:48 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:30.109 07:02:48 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:30.109 07:02:48 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:30.109 07:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.109 07:02:48 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:30.109 07:02:48 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:30.109 07:02:48 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:30.109 07:02:48 -- 
accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:30.109 07:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.109 07:02:48 -- accel/accel_rpc.sh@42 -- # grep software 00:07:30.109 07:02:48 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:30.109 software 00:07:30.109 00:07:30.109 real 0m0.218s 00:07:30.109 user 0m0.046s 00:07:30.109 sys 0m0.008s 00:07:30.109 07:02:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:30.109 07:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.109 ************************************ 00:07:30.109 END TEST accel_assign_opcode 00:07:30.109 ************************************ 00:07:30.109 07:02:48 -- accel/accel_rpc.sh@55 -- # killprocess 488277 00:07:30.109 07:02:48 -- common/autotest_common.sh@936 -- # '[' -z 488277 ']' 00:07:30.368 07:02:48 -- common/autotest_common.sh@940 -- # kill -0 488277 00:07:30.368 07:02:48 -- common/autotest_common.sh@941 -- # uname 00:07:30.368 07:02:48 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:30.368 07:02:48 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 488277 00:07:30.368 07:02:48 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:30.368 07:02:48 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:30.368 07:02:48 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 488277' 00:07:30.368 killing process with pid 488277 00:07:30.368 07:02:48 -- common/autotest_common.sh@955 -- # kill 488277 00:07:30.368 07:02:48 -- common/autotest_common.sh@960 -- # wait 488277 00:07:30.627 00:07:30.627 real 0m0.992s 00:07:30.627 user 0m0.884s 00:07:30.627 sys 0m0.469s 00:07:30.627 07:02:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:30.627 07:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.627 ************************************ 00:07:30.627 END TEST accel_rpc 00:07:30.627 ************************************ 00:07:30.627 07:02:48 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:30.627 07:02:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:30.627 07:02:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:30.627 07:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.627 ************************************ 00:07:30.627 START TEST app_cmdline 00:07:30.627 ************************************ 00:07:30.627 07:02:48 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:30.627 * Looking for test storage... 
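The accel_assign_opcode test above drives three RPCs against a spdk_tgt started with --wait-for-rpc: accel_assign_opc (first to a bogus module, then to "software"), framework_start_init, and accel_get_opc_assignments. Under the hood, rpc_cmd is a JSON-RPC 2.0 exchange over the /var/tmp/spdk.sock UNIX socket; the sketch below is one way to issue the same calls without rpc.py. The method names come from the log; the "opname"/"module" parameter names are an assumption on my part.

```python
import json, socket

def spdk_rpc(method, params=None, sock_path="/var/tmp/spdk.sock"):
    req = {"jsonrpc": "2.0", "id": 1, "method": method}
    if params is not None:
        req["params"] = params
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
        s.connect(sock_path)
        s.sendall(json.dumps(req).encode())
        buf = b""
        while True:
            chunk = s.recv(4096)
            buf += chunk
            try:
                return json.loads(buf)   # done once a complete JSON doc arrived
            except json.JSONDecodeError:
                if not chunk:            # connection closed mid-response
                    raise

spdk_rpc("accel_assign_opc", {"opname": "copy", "module": "software"})
spdk_rpc("framework_start_init")   # finish startup; spdk_tgt ran with --wait-for-rpc
print(spdk_rpc("accel_get_opc_assignments")["result"]["copy"])   # -> "software"
```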
00:07:30.627 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:30.627 07:02:48 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:30.627 07:02:48 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:30.627 07:02:48 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:30.886 07:02:48 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:30.886 07:02:48 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:30.886 07:02:48 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:30.886 07:02:48 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:30.886 07:02:48 -- scripts/common.sh@335 -- # IFS=.-: 00:07:30.886 07:02:48 -- scripts/common.sh@335 -- # read -ra ver1 00:07:30.886 07:02:48 -- scripts/common.sh@336 -- # IFS=.-: 00:07:30.886 07:02:48 -- scripts/common.sh@336 -- # read -ra ver2 00:07:30.886 07:02:48 -- scripts/common.sh@337 -- # local 'op=<' 00:07:30.886 07:02:48 -- scripts/common.sh@339 -- # ver1_l=2 00:07:30.886 07:02:48 -- scripts/common.sh@340 -- # ver2_l=1 00:07:30.886 07:02:48 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:30.886 07:02:48 -- scripts/common.sh@343 -- # case "$op" in 00:07:30.886 07:02:48 -- scripts/common.sh@344 -- # : 1 00:07:30.886 07:02:48 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:30.886 07:02:48 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:30.886 07:02:48 -- scripts/common.sh@364 -- # decimal 1 00:07:30.886 07:02:48 -- scripts/common.sh@352 -- # local d=1 00:07:30.886 07:02:48 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:30.886 07:02:48 -- scripts/common.sh@354 -- # echo 1 00:07:30.886 07:02:48 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:30.886 07:02:48 -- scripts/common.sh@365 -- # decimal 2 00:07:30.886 07:02:48 -- scripts/common.sh@352 -- # local d=2 00:07:30.886 07:02:48 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:30.886 07:02:48 -- scripts/common.sh@354 -- # echo 2 00:07:30.886 07:02:48 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:30.886 07:02:48 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:30.886 07:02:48 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:30.886 07:02:48 -- scripts/common.sh@367 -- # return 0 00:07:30.886 07:02:48 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:30.886 07:02:48 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:30.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.886 --rc genhtml_branch_coverage=1 00:07:30.886 --rc genhtml_function_coverage=1 00:07:30.886 --rc genhtml_legend=1 00:07:30.886 --rc geninfo_all_blocks=1 00:07:30.886 --rc geninfo_unexecuted_blocks=1 00:07:30.886 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.886 ' 00:07:30.886 07:02:48 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:30.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.886 --rc genhtml_branch_coverage=1 00:07:30.886 --rc genhtml_function_coverage=1 00:07:30.886 --rc genhtml_legend=1 00:07:30.886 --rc geninfo_all_blocks=1 00:07:30.886 --rc geninfo_unexecuted_blocks=1 00:07:30.886 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.886 ' 00:07:30.886 07:02:48 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:30.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.886 --rc genhtml_branch_coverage=1 00:07:30.886 
--rc genhtml_function_coverage=1 00:07:30.886 --rc genhtml_legend=1 00:07:30.886 --rc geninfo_all_blocks=1 00:07:30.886 --rc geninfo_unexecuted_blocks=1 00:07:30.886 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.886 ' 00:07:30.886 07:02:48 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:30.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.886 --rc genhtml_branch_coverage=1 00:07:30.886 --rc genhtml_function_coverage=1 00:07:30.886 --rc genhtml_legend=1 00:07:30.886 --rc geninfo_all_blocks=1 00:07:30.886 --rc geninfo_unexecuted_blocks=1 00:07:30.886 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.886 ' 00:07:30.886 07:02:48 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:30.886 07:02:48 -- app/cmdline.sh@17 -- # spdk_tgt_pid=488612 00:07:30.886 07:02:48 -- app/cmdline.sh@18 -- # waitforlisten 488612 00:07:30.886 07:02:48 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:30.886 07:02:48 -- common/autotest_common.sh@829 -- # '[' -z 488612 ']' 00:07:30.886 07:02:48 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:30.886 07:02:48 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:30.886 07:02:48 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:30.886 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:30.886 07:02:48 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:30.886 07:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:30.886 [2024-12-13 07:02:48.958923] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
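The spdk_tgt being launched here for the cmdline test is started with `--rpcs-allowed spdk_get_version,rpc_get_methods`, so only those two methods are callable; anything else should come back as JSON-RPC error -32601, which is exactly what the env_dpdk_get_mem_stats probe further below demonstrates. A hedged sketch of that check using the in-tree rpc.py client (paths as in the log; which stream rpc.py prints the error to is not guaranteed, so both are searched):

```python
import subprocess

RPC = "/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py"

ok = subprocess.run([RPC, "spdk_get_version"], capture_output=True, text=True)
print(ok.stdout)                      # version JSON, as shown below in the log

bad = subprocess.run([RPC, "env_dpdk_get_mem_stats"], capture_output=True, text=True)
assert bad.returncode != 0            # method is outside the allowlist
assert "Method not found" in bad.stdout + bad.stderr   # JSON-RPC code -32601
```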
00:07:30.886 [2024-12-13 07:02:48.959010] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid488612 ] 00:07:30.886 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.886 [2024-12-13 07:02:49.026859] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.886 [2024-12-13 07:02:49.063887] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:30.886 [2024-12-13 07:02:49.063993] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.823 07:02:49 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:31.823 07:02:49 -- common/autotest_common.sh@862 -- # return 0 00:07:31.823 07:02:49 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:31.823 { 00:07:31.823 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:07:31.823 "fields": { 00:07:31.823 "major": 24, 00:07:31.823 "minor": 1, 00:07:31.823 "patch": 1, 00:07:31.823 "suffix": "-pre", 00:07:31.823 "commit": "c13c99a5e" 00:07:31.823 } 00:07:31.823 } 00:07:31.823 07:02:49 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:31.823 07:02:49 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:31.823 07:02:49 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:31.823 07:02:49 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:31.823 07:02:49 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:31.823 07:02:49 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:31.823 07:02:49 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:31.823 07:02:49 -- common/autotest_common.sh@10 -- # set +x 00:07:31.823 07:02:49 -- app/cmdline.sh@26 -- # sort 00:07:31.823 07:02:49 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:31.823 07:02:49 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:31.823 07:02:49 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:31.823 07:02:49 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:31.823 07:02:49 -- common/autotest_common.sh@650 -- # local es=0 00:07:31.823 07:02:49 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:31.823 07:02:49 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.823 07:02:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:31.823 07:02:49 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.823 07:02:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:31.823 07:02:49 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.823 07:02:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:31.823 07:02:49 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.823 07:02:49 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:31.823 07:02:49 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:32.082 request: 00:07:32.082 { 00:07:32.082 "method": "env_dpdk_get_mem_stats", 00:07:32.082 "req_id": 1 00:07:32.082 } 00:07:32.082 Got JSON-RPC error response 00:07:32.082 response: 00:07:32.082 { 00:07:32.082 "code": -32601, 00:07:32.082 "message": "Method not found" 00:07:32.082 } 00:07:32.082 07:02:50 -- common/autotest_common.sh@653 -- # es=1 00:07:32.082 07:02:50 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:32.082 07:02:50 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:32.082 07:02:50 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:32.082 07:02:50 -- app/cmdline.sh@1 -- # killprocess 488612 00:07:32.082 07:02:50 -- common/autotest_common.sh@936 -- # '[' -z 488612 ']' 00:07:32.082 07:02:50 -- common/autotest_common.sh@940 -- # kill -0 488612 00:07:32.082 07:02:50 -- common/autotest_common.sh@941 -- # uname 00:07:32.082 07:02:50 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:32.082 07:02:50 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 488612 00:07:32.082 07:02:50 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:32.082 07:02:50 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:32.082 07:02:50 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 488612' 00:07:32.082 killing process with pid 488612 00:07:32.082 07:02:50 -- common/autotest_common.sh@955 -- # kill 488612 00:07:32.082 07:02:50 -- common/autotest_common.sh@960 -- # wait 488612 00:07:32.342 00:07:32.342 real 0m1.794s 00:07:32.342 user 0m2.078s 00:07:32.342 sys 0m0.520s 00:07:32.342 07:02:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:32.342 07:02:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.342 ************************************ 00:07:32.342 END TEST app_cmdline 00:07:32.342 ************************************ 00:07:32.342 07:02:50 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:32.342 07:02:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:32.342 07:02:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:32.342 07:02:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.601 ************************************ 00:07:32.601 START TEST version 00:07:32.601 ************************************ 00:07:32.601 07:02:50 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:32.601 * Looking for test storage... 
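killprocess, traced twice above (pids 488277 and 488612), first verifies the pid is still alive with `kill -0`, reads the process name (the `'[' reactor_0 = sudo ']'` guard above), sends the default SIGTERM, and waits for the target to exit. A rough Python equivalent; the SIGKILL escalation at the end is my addition, not something autotest_common.sh does:

```python
import os, signal, time

def killprocess(pid: int, timeout: float = 10.0) -> None:
    os.kill(pid, 0)                        # like `kill -0`: raises if pid is gone
    with open(f"/proc/{pid}/comm") as f:   # like `ps --no-headers -o comm=`
        name = f.read().strip()
    print(f"killing process with pid {pid} ({name})")
    os.kill(pid, signal.SIGTERM)
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:     # bash `wait` only works on children,
        try:                               # so poll for exit instead
            os.kill(pid, 0)
        except ProcessLookupError:
            return
        time.sleep(0.1)
    os.kill(pid, signal.SIGKILL)           # assumption: escalate on timeout
```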
00:07:32.602 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:32.602 07:02:50 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:32.602 07:02:50 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:32.602 07:02:50 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:32.602 07:02:50 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:32.602 07:02:50 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:32.602 07:02:50 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:32.602 07:02:50 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:32.602 07:02:50 -- scripts/common.sh@335 -- # IFS=.-: 00:07:32.602 07:02:50 -- scripts/common.sh@335 -- # read -ra ver1 00:07:32.602 07:02:50 -- scripts/common.sh@336 -- # IFS=.-: 00:07:32.602 07:02:50 -- scripts/common.sh@336 -- # read -ra ver2 00:07:32.602 07:02:50 -- scripts/common.sh@337 -- # local 'op=<' 00:07:32.602 07:02:50 -- scripts/common.sh@339 -- # ver1_l=2 00:07:32.602 07:02:50 -- scripts/common.sh@340 -- # ver2_l=1 00:07:32.602 07:02:50 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:32.602 07:02:50 -- scripts/common.sh@343 -- # case "$op" in 00:07:32.602 07:02:50 -- scripts/common.sh@344 -- # : 1 00:07:32.602 07:02:50 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:32.602 07:02:50 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:32.602 07:02:50 -- scripts/common.sh@364 -- # decimal 1 00:07:32.602 07:02:50 -- scripts/common.sh@352 -- # local d=1 00:07:32.602 07:02:50 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:32.602 07:02:50 -- scripts/common.sh@354 -- # echo 1 00:07:32.602 07:02:50 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:32.602 07:02:50 -- scripts/common.sh@365 -- # decimal 2 00:07:32.602 07:02:50 -- scripts/common.sh@352 -- # local d=2 00:07:32.602 07:02:50 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:32.602 07:02:50 -- scripts/common.sh@354 -- # echo 2 00:07:32.602 07:02:50 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:32.602 07:02:50 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:32.602 07:02:50 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:32.602 07:02:50 -- scripts/common.sh@367 -- # return 0 00:07:32.602 07:02:50 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:32.602 07:02:50 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:32.602 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.602 --rc genhtml_branch_coverage=1 00:07:32.602 --rc genhtml_function_coverage=1 00:07:32.602 --rc genhtml_legend=1 00:07:32.602 --rc geninfo_all_blocks=1 00:07:32.602 --rc geninfo_unexecuted_blocks=1 00:07:32.602 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.602 ' 00:07:32.602 07:02:50 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:32.602 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.602 --rc genhtml_branch_coverage=1 00:07:32.602 --rc genhtml_function_coverage=1 00:07:32.602 --rc genhtml_legend=1 00:07:32.602 --rc geninfo_all_blocks=1 00:07:32.602 --rc geninfo_unexecuted_blocks=1 00:07:32.602 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.602 ' 00:07:32.602 07:02:50 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:32.602 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.602 --rc genhtml_branch_coverage=1 00:07:32.602 
--rc genhtml_function_coverage=1 00:07:32.602 --rc genhtml_legend=1 00:07:32.602 --rc geninfo_all_blocks=1 00:07:32.602 --rc geninfo_unexecuted_blocks=1 00:07:32.602 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.602 ' 00:07:32.602 07:02:50 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:32.602 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.602 --rc genhtml_branch_coverage=1 00:07:32.602 --rc genhtml_function_coverage=1 00:07:32.602 --rc genhtml_legend=1 00:07:32.602 --rc geninfo_all_blocks=1 00:07:32.602 --rc geninfo_unexecuted_blocks=1 00:07:32.602 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.602 ' 00:07:32.602 07:02:50 -- app/version.sh@17 -- # get_header_version major 00:07:32.602 07:02:50 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:32.602 07:02:50 -- app/version.sh@14 -- # cut -f2 00:07:32.602 07:02:50 -- app/version.sh@14 -- # tr -d '"' 00:07:32.602 07:02:50 -- app/version.sh@17 -- # major=24 00:07:32.602 07:02:50 -- app/version.sh@18 -- # get_header_version minor 00:07:32.602 07:02:50 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:32.602 07:02:50 -- app/version.sh@14 -- # cut -f2 00:07:32.602 07:02:50 -- app/version.sh@14 -- # tr -d '"' 00:07:32.602 07:02:50 -- app/version.sh@18 -- # minor=1 00:07:32.602 07:02:50 -- app/version.sh@19 -- # get_header_version patch 00:07:32.602 07:02:50 -- app/version.sh@14 -- # cut -f2 00:07:32.602 07:02:50 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:32.602 07:02:50 -- app/version.sh@14 -- # tr -d '"' 00:07:32.602 07:02:50 -- app/version.sh@19 -- # patch=1 00:07:32.602 07:02:50 -- app/version.sh@20 -- # get_header_version suffix 00:07:32.602 07:02:50 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:32.602 07:02:50 -- app/version.sh@14 -- # cut -f2 00:07:32.602 07:02:50 -- app/version.sh@14 -- # tr -d '"' 00:07:32.602 07:02:50 -- app/version.sh@20 -- # suffix=-pre 00:07:32.602 07:02:50 -- app/version.sh@22 -- # version=24.1 00:07:32.602 07:02:50 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:32.602 07:02:50 -- app/version.sh@25 -- # version=24.1.1 00:07:32.602 07:02:50 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:32.602 07:02:50 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:32.602 07:02:50 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:32.862 07:02:50 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:32.862 07:02:50 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:32.862 00:07:32.862 real 0m0.262s 00:07:32.862 user 0m0.156s 00:07:32.862 sys 0m0.158s 00:07:32.862 07:02:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:32.862 07:02:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.862 
************************************ 00:07:32.862 END TEST version 00:07:32.862 ************************************ 00:07:32.862 07:02:50 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@191 -- # uname -s 00:07:32.862 07:02:50 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:07:32.862 07:02:50 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:32.862 07:02:50 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:32.862 07:02:50 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@255 -- # timing_exit lib 00:07:32.862 07:02:50 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:32.862 07:02:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.862 07:02:50 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:32.862 07:02:50 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:07:32.862 07:02:50 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:07:32.862 07:02:50 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:07:32.862 07:02:50 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:32.862 07:02:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:32.862 07:02:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:32.862 07:02:50 -- common/autotest_common.sh@10 -- # set +x 00:07:32.862 ************************************ 00:07:32.862 START TEST llvm_fuzz 00:07:32.862 ************************************ 00:07:32.862 07:02:50 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:32.862 * Looking for test storage... 
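The version test that just finished reads SPDK_VERSION_MAJOR/MINOR/PATCH/SUFFIX out of include/spdk/version.h with grep/cut/tr, assembles 24.1.1, maps the -pre suffix to an rc0 tail, and checks the result against `python3 -c 'import spdk; print(spdk.__version__)'`. A compact re-derivation of that assembly (the -pre to rc0 mapping is inferred from the logged values, not from reading version.sh):

```python
import re

def get_header_version(header_text: str, field: str) -> str:
    # e.g. field="MAJOR" matches '#define SPDK_VERSION_MAJOR 24'
    m = re.search(rf"^#define SPDK_VERSION_{field}\s+(\S+)", header_text, re.M)
    return m.group(1).strip('"') if m else ""

header = ('#define SPDK_VERSION_MAJOR 24\n#define SPDK_VERSION_MINOR 1\n'
          '#define SPDK_VERSION_PATCH 1\n#define SPDK_VERSION_SUFFIX "-pre"\n')
major, minor, patch = (get_header_version(header, f) for f in ("MAJOR", "MINOR", "PATCH"))
version = f"{major}.{minor}" + (f".{patch}" if patch != "0" else "")
if get_header_version(header, "SUFFIX") == "-pre":
    version += "rc0"
assert version == "24.1.1rc0"   # matches py_version in the log
```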
00:07:32.862 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:32.862 07:02:51 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:32.862 07:02:51 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:32.862 07:02:51 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:33.122 07:02:51 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:33.122 07:02:51 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:33.122 07:02:51 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:33.122 07:02:51 -- scripts/common.sh@335 -- # IFS=.-: 00:07:33.122 07:02:51 -- scripts/common.sh@335 -- # read -ra ver1 00:07:33.122 07:02:51 -- scripts/common.sh@336 -- # IFS=.-: 00:07:33.122 07:02:51 -- scripts/common.sh@336 -- # read -ra ver2 00:07:33.122 07:02:51 -- scripts/common.sh@337 -- # local 'op=<' 00:07:33.122 07:02:51 -- scripts/common.sh@339 -- # ver1_l=2 00:07:33.122 07:02:51 -- scripts/common.sh@340 -- # ver2_l=1 00:07:33.122 07:02:51 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:33.122 07:02:51 -- scripts/common.sh@343 -- # case "$op" in 00:07:33.122 07:02:51 -- scripts/common.sh@344 -- # : 1 00:07:33.122 07:02:51 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:33.122 07:02:51 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:33.122 07:02:51 -- scripts/common.sh@364 -- # decimal 1 00:07:33.122 07:02:51 -- scripts/common.sh@352 -- # local d=1 00:07:33.122 07:02:51 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:33.122 07:02:51 -- scripts/common.sh@354 -- # echo 1 00:07:33.122 07:02:51 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:33.122 07:02:51 -- scripts/common.sh@365 -- # decimal 2 00:07:33.122 07:02:51 -- scripts/common.sh@352 -- # local d=2 00:07:33.122 07:02:51 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:33.122 07:02:51 -- scripts/common.sh@354 -- # echo 2 00:07:33.122 07:02:51 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:33.122 07:02:51 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:33.122 07:02:51 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:33.122 07:02:51 -- scripts/common.sh@367 -- # return 0 00:07:33.122 07:02:51 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:33.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.122 --rc genhtml_branch_coverage=1 00:07:33.122 --rc genhtml_function_coverage=1 00:07:33.122 --rc genhtml_legend=1 00:07:33.122 --rc geninfo_all_blocks=1 00:07:33.122 --rc geninfo_unexecuted_blocks=1 00:07:33.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.122 ' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:33.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.122 --rc genhtml_branch_coverage=1 00:07:33.122 --rc genhtml_function_coverage=1 00:07:33.122 --rc genhtml_legend=1 00:07:33.122 --rc geninfo_all_blocks=1 00:07:33.122 --rc geninfo_unexecuted_blocks=1 00:07:33.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.122 ' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:33.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.122 --rc genhtml_branch_coverage=1 00:07:33.122 
--rc genhtml_function_coverage=1 00:07:33.122 --rc genhtml_legend=1 00:07:33.122 --rc geninfo_all_blocks=1 00:07:33.122 --rc geninfo_unexecuted_blocks=1 00:07:33.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.122 ' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:33.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.122 --rc genhtml_branch_coverage=1 00:07:33.122 --rc genhtml_function_coverage=1 00:07:33.122 --rc genhtml_legend=1 00:07:33.122 --rc geninfo_all_blocks=1 00:07:33.122 --rc geninfo_unexecuted_blocks=1 00:07:33.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.122 ' 00:07:33.122 07:02:51 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:33.122 07:02:51 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:33.122 07:02:51 -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:33.122 07:02:51 -- common/autotest_common.sh@548 -- # local fuzzers 00:07:33.122 07:02:51 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:33.122 07:02:51 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:33.122 07:02:51 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:33.122 07:02:51 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:33.122 07:02:51 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:33.122 07:02:51 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:33.122 07:02:51 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:33.122 07:02:51 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:33.122 07:02:51 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:33.122 07:02:51 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:33.122 07:02:51 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:33.122 07:02:51 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:33.122 07:02:51 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:33.122 07:02:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:33.122 07:02:51 -- common/autotest_common.sh@10 -- # set +x 00:07:33.122 ************************************ 00:07:33.122 START TEST nvmf_fuzz 00:07:33.122 ************************************ 00:07:33.122 07:02:51 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:33.122 * Looking for test storage... 
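Before nvmf_fuzz starts, llvm.sh computed its fuzzer list (the `echo 'common.sh llvm-gcov.sh nvmf vfio'` trace above): it globs test/fuzz/llvm/*, keeps the basenames, and its case statement skips the two helper scripts so that only nvmf and vfio are run. A sketch that approximates the case filter with an explicit helper-name list:

```python
import os

def get_fuzzer_targets(rootdir: str) -> list:
    entries = sorted(os.listdir(os.path.join(rootdir, "test/fuzz/llvm")))
    # llvm.sh's case statement skips the helpers; hard-coding their
    # names here approximates that filter.
    return [e for e in entries if e not in ("common.sh", "llvm-gcov.sh")]

# get_fuzzer_targets("/var/jenkins/workspace/short-fuzz-phy-autotest/spdk")
# -> ["nvmf", "vfio"]
```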
00:07:33.122 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.122 07:02:51 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:33.122 07:02:51 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:33.122 07:02:51 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:33.122 07:02:51 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:33.122 07:02:51 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:33.122 07:02:51 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:33.122 07:02:51 -- scripts/common.sh@335 -- # IFS=.-: 00:07:33.122 07:02:51 -- scripts/common.sh@335 -- # read -ra ver1 00:07:33.122 07:02:51 -- scripts/common.sh@336 -- # IFS=.-: 00:07:33.122 07:02:51 -- scripts/common.sh@336 -- # read -ra ver2 00:07:33.122 07:02:51 -- scripts/common.sh@337 -- # local 'op=<' 00:07:33.122 07:02:51 -- scripts/common.sh@339 -- # ver1_l=2 00:07:33.122 07:02:51 -- scripts/common.sh@340 -- # ver2_l=1 00:07:33.122 07:02:51 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:33.122 07:02:51 -- scripts/common.sh@343 -- # case "$op" in 00:07:33.122 07:02:51 -- scripts/common.sh@344 -- # : 1 00:07:33.122 07:02:51 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:33.122 07:02:51 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:33.122 07:02:51 -- scripts/common.sh@364 -- # decimal 1 00:07:33.122 07:02:51 -- scripts/common.sh@352 -- # local d=1 00:07:33.122 07:02:51 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:33.122 07:02:51 -- scripts/common.sh@354 -- # echo 1 00:07:33.122 07:02:51 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:33.122 07:02:51 -- scripts/common.sh@365 -- # decimal 2 00:07:33.122 07:02:51 -- scripts/common.sh@352 -- # local d=2 00:07:33.122 07:02:51 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:33.122 07:02:51 -- scripts/common.sh@354 -- # echo 2 00:07:33.122 07:02:51 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:33.122 07:02:51 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:33.122 07:02:51 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:33.122 07:02:51 -- scripts/common.sh@367 -- # return 0 00:07:33.122 07:02:51 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:33.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.122 --rc genhtml_branch_coverage=1 00:07:33.122 --rc genhtml_function_coverage=1 00:07:33.122 --rc genhtml_legend=1 00:07:33.122 --rc geninfo_all_blocks=1 00:07:33.122 --rc geninfo_unexecuted_blocks=1 00:07:33.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.122 ' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:33.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.122 --rc genhtml_branch_coverage=1 00:07:33.122 --rc genhtml_function_coverage=1 00:07:33.122 --rc genhtml_legend=1 00:07:33.122 --rc geninfo_all_blocks=1 00:07:33.122 --rc geninfo_unexecuted_blocks=1 00:07:33.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.122 ' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:33.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.122 --rc genhtml_branch_coverage=1 
00:07:33.122 --rc genhtml_function_coverage=1 00:07:33.122 --rc genhtml_legend=1 00:07:33.122 --rc geninfo_all_blocks=1 00:07:33.122 --rc geninfo_unexecuted_blocks=1 00:07:33.122 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.122 ' 00:07:33.122 07:02:51 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:33.122 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.123 --rc genhtml_branch_coverage=1 00:07:33.123 --rc genhtml_function_coverage=1 00:07:33.123 --rc genhtml_legend=1 00:07:33.123 --rc geninfo_all_blocks=1 00:07:33.123 --rc geninfo_unexecuted_blocks=1 00:07:33.123 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.123 ' 00:07:33.123 07:02:51 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:33.123 07:02:51 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:33.123 07:02:51 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:33.123 07:02:51 -- common/autotest_common.sh@34 -- # set -e 00:07:33.123 07:02:51 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:33.123 07:02:51 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:33.123 07:02:51 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:33.123 07:02:51 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:33.123 07:02:51 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:33.123 07:02:51 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:33.123 07:02:51 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:33.123 07:02:51 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:33.123 07:02:51 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:33.123 07:02:51 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:33.123 07:02:51 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:33.123 07:02:51 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:33.123 07:02:51 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:33.123 07:02:51 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:33.123 07:02:51 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:33.123 07:02:51 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:33.123 07:02:51 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:33.123 07:02:51 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:33.123 07:02:51 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:33.123 07:02:51 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:33.123 07:02:51 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:33.123 07:02:51 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:33.123 07:02:51 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:33.123 07:02:51 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:33.123 07:02:51 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:33.123 07:02:51 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:33.123 07:02:51 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:33.123 07:02:51 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:33.123 07:02:51 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:33.123 
07:02:51 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:33.123 07:02:51 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:33.123 07:02:51 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:33.123 07:02:51 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:33.123 07:02:51 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:33.123 07:02:51 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:33.123 07:02:51 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:33.123 07:02:51 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:33.123 07:02:51 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:33.123 07:02:51 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:33.123 07:02:51 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:33.123 07:02:51 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:33.123 07:02:51 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:33.123 07:02:51 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:33.123 07:02:51 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:33.123 07:02:51 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:33.123 07:02:51 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:33.123 07:02:51 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:33.123 07:02:51 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:33.123 07:02:51 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:33.123 07:02:51 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:33.123 07:02:51 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:33.123 07:02:51 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:33.123 07:02:51 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:33.123 07:02:51 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:33.123 07:02:51 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:33.123 07:02:51 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:33.123 07:02:51 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:33.123 07:02:51 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:33.123 07:02:51 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:33.123 07:02:51 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:33.123 07:02:51 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:33.123 07:02:51 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:33.123 07:02:51 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:33.123 07:02:51 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:33.123 07:02:51 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:33.123 07:02:51 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:33.123 07:02:51 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:33.123 07:02:51 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:33.123 07:02:51 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:33.123 07:02:51 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:33.123 07:02:51 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:33.123 07:02:51 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:33.123 07:02:51 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:33.123 07:02:51 -- 
common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:33.123 07:02:51 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:33.123 07:02:51 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:33.385 07:02:51 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:33.385 07:02:51 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:33.385 07:02:51 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:33.385 07:02:51 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:33.385 07:02:51 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:33.385 07:02:51 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:33.385 07:02:51 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:33.385 07:02:51 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:33.385 07:02:51 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:33.385 07:02:51 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:33.385 07:02:51 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:33.385 07:02:51 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:33.385 07:02:51 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:33.385 07:02:51 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:33.385 07:02:51 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:33.385 07:02:51 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:33.385 07:02:51 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:33.385 07:02:51 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:33.385 07:02:51 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:33.385 07:02:51 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:33.385 07:02:51 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:33.385 07:02:51 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:33.385 07:02:51 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:33.385 #define SPDK_CONFIG_H 00:07:33.385 #define SPDK_CONFIG_APPS 1 00:07:33.385 #define SPDK_CONFIG_ARCH native 00:07:33.385 #undef SPDK_CONFIG_ASAN 00:07:33.385 #undef SPDK_CONFIG_AVAHI 00:07:33.385 #undef SPDK_CONFIG_CET 00:07:33.385 #define SPDK_CONFIG_COVERAGE 1 00:07:33.385 #define SPDK_CONFIG_CROSS_PREFIX 00:07:33.385 #undef SPDK_CONFIG_CRYPTO 00:07:33.385 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:33.385 #undef SPDK_CONFIG_CUSTOMOCF 00:07:33.385 #undef SPDK_CONFIG_DAOS 00:07:33.385 #define SPDK_CONFIG_DAOS_DIR 00:07:33.385 #define SPDK_CONFIG_DEBUG 1 00:07:33.385 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:33.385 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:33.385 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:33.385 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:33.385 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:33.385 #define 
SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:33.385 #define SPDK_CONFIG_EXAMPLES 1 00:07:33.385 #undef SPDK_CONFIG_FC 00:07:33.385 #define SPDK_CONFIG_FC_PATH 00:07:33.385 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:33.385 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:33.385 #undef SPDK_CONFIG_FUSE 00:07:33.385 #define SPDK_CONFIG_FUZZER 1 00:07:33.385 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:33.385 #undef SPDK_CONFIG_GOLANG 00:07:33.385 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:33.385 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:33.385 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:33.385 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:33.385 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:33.385 #define SPDK_CONFIG_IDXD 1 00:07:33.385 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:33.385 #undef SPDK_CONFIG_IPSEC_MB 00:07:33.385 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:33.385 #define SPDK_CONFIG_ISAL 1 00:07:33.385 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:33.385 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:33.385 #define SPDK_CONFIG_LIBDIR 00:07:33.385 #undef SPDK_CONFIG_LTO 00:07:33.385 #define SPDK_CONFIG_MAX_LCORES 00:07:33.385 #define SPDK_CONFIG_NVME_CUSE 1 00:07:33.385 #undef SPDK_CONFIG_OCF 00:07:33.385 #define SPDK_CONFIG_OCF_PATH 00:07:33.385 #define SPDK_CONFIG_OPENSSL_PATH 00:07:33.385 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:33.385 #undef SPDK_CONFIG_PGO_USE 00:07:33.385 #define SPDK_CONFIG_PREFIX /usr/local 00:07:33.385 #undef SPDK_CONFIG_RAID5F 00:07:33.385 #undef SPDK_CONFIG_RBD 00:07:33.385 #define SPDK_CONFIG_RDMA 1 00:07:33.385 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:33.385 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:33.385 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:33.385 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:33.385 #undef SPDK_CONFIG_SHARED 00:07:33.385 #undef SPDK_CONFIG_SMA 00:07:33.385 #define SPDK_CONFIG_TESTS 1 00:07:33.385 #undef SPDK_CONFIG_TSAN 00:07:33.385 #define SPDK_CONFIG_UBLK 1 00:07:33.385 #define SPDK_CONFIG_UBSAN 1 00:07:33.385 #undef SPDK_CONFIG_UNIT_TESTS 00:07:33.385 #undef SPDK_CONFIG_URING 00:07:33.385 #define SPDK_CONFIG_URING_PATH 00:07:33.385 #undef SPDK_CONFIG_URING_ZNS 00:07:33.385 #undef SPDK_CONFIG_USDT 00:07:33.385 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:33.385 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:33.385 #define SPDK_CONFIG_VFIO_USER 1 00:07:33.385 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:33.385 #define SPDK_CONFIG_VHOST 1 00:07:33.385 #define SPDK_CONFIG_VIRTIO 1 00:07:33.385 #undef SPDK_CONFIG_VTUNE 00:07:33.385 #define SPDK_CONFIG_VTUNE_DIR 00:07:33.385 #define SPDK_CONFIG_WERROR 1 00:07:33.385 #define SPDK_CONFIG_WPDK_DIR 00:07:33.385 #undef SPDK_CONFIG_XNVME 00:07:33.385 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:33.385 07:02:51 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:33.385 07:02:51 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:33.385 07:02:51 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:33.385 07:02:51 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:33.385 07:02:51 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:33.385 07:02:51 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:33.385 07:02:51 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:33.385 07:02:51 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:33.385 07:02:51 -- paths/export.sh@5 -- # export PATH 00:07:33.385 07:02:51 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:33.385 07:02:51 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:33.385 07:02:51 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:33.385 07:02:51 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:33.385 07:02:51 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:33.385 07:02:51 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:33.385 07:02:51 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:33.385 07:02:51 -- pm/common@16 -- # TEST_TAG=N/A 00:07:33.385 07:02:51 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:33.385 07:02:51 -- common/autotest_common.sh@52 -- # : 1 00:07:33.385 07:02:51 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:33.385 07:02:51 -- common/autotest_common.sh@56 -- # : 0 00:07:33.385 07:02:51 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:33.385 07:02:51 -- common/autotest_common.sh@58 -- # : 0 00:07:33.385 07:02:51 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:33.385 07:02:51 -- common/autotest_common.sh@60 -- # : 1 00:07:33.385 07:02:51 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:07:33.385 07:02:51 -- common/autotest_common.sh@62 -- # : 0 00:07:33.385 07:02:51 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:33.385 07:02:51 -- common/autotest_common.sh@64 -- # : 00:07:33.385 07:02:51 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:33.385 07:02:51 -- common/autotest_common.sh@66 -- # : 0 00:07:33.385 07:02:51 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:33.385 07:02:51 -- common/autotest_common.sh@68 -- # : 0 00:07:33.385 07:02:51 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:33.385 07:02:51 -- common/autotest_common.sh@70 -- # : 0 00:07:33.385 07:02:51 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:33.386 07:02:51 -- common/autotest_common.sh@72 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:33.386 07:02:51 -- common/autotest_common.sh@74 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:33.386 07:02:51 -- common/autotest_common.sh@76 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:33.386 07:02:51 -- common/autotest_common.sh@78 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:33.386 07:02:51 -- common/autotest_common.sh@80 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:33.386 07:02:51 -- common/autotest_common.sh@82 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:33.386 07:02:51 -- common/autotest_common.sh@84 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:33.386 07:02:51 -- common/autotest_common.sh@86 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:33.386 07:02:51 -- common/autotest_common.sh@88 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:33.386 07:02:51 -- common/autotest_common.sh@90 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:33.386 07:02:51 -- common/autotest_common.sh@92 -- # : 1 00:07:33.386 07:02:51 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:33.386 07:02:51 -- common/autotest_common.sh@94 -- # : 1 00:07:33.386 07:02:51 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:33.386 07:02:51 -- common/autotest_common.sh@96 -- # : rdma 00:07:33.386 07:02:51 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:33.386 07:02:51 -- common/autotest_common.sh@98 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:33.386 07:02:51 -- common/autotest_common.sh@100 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:33.386 07:02:51 -- common/autotest_common.sh@102 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:33.386 07:02:51 -- common/autotest_common.sh@104 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:33.386 07:02:51 -- common/autotest_common.sh@106 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:33.386 07:02:51 -- common/autotest_common.sh@108 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:07:33.386 07:02:51 -- common/autotest_common.sh@110 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:33.386 07:02:51 -- common/autotest_common.sh@112 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:33.386 07:02:51 -- common/autotest_common.sh@114 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:33.386 07:02:51 -- common/autotest_common.sh@116 -- # : 1 00:07:33.386 07:02:51 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:33.386 07:02:51 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:33.386 07:02:51 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:33.386 07:02:51 -- common/autotest_common.sh@120 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:33.386 07:02:51 -- common/autotest_common.sh@122 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:33.386 07:02:51 -- common/autotest_common.sh@124 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:33.386 07:02:51 -- common/autotest_common.sh@126 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:33.386 07:02:51 -- common/autotest_common.sh@128 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:33.386 07:02:51 -- common/autotest_common.sh@130 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:33.386 07:02:51 -- common/autotest_common.sh@132 -- # : v22.11.4 00:07:33.386 07:02:51 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:33.386 07:02:51 -- common/autotest_common.sh@134 -- # : true 00:07:33.386 07:02:51 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:33.386 07:02:51 -- common/autotest_common.sh@136 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:33.386 07:02:51 -- common/autotest_common.sh@138 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:33.386 07:02:51 -- common/autotest_common.sh@140 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:33.386 07:02:51 -- common/autotest_common.sh@142 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:33.386 07:02:51 -- common/autotest_common.sh@144 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:33.386 07:02:51 -- common/autotest_common.sh@146 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:33.386 07:02:51 -- common/autotest_common.sh@148 -- # : 00:07:33.386 07:02:51 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:33.386 07:02:51 -- common/autotest_common.sh@150 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:33.386 07:02:51 -- common/autotest_common.sh@152 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:33.386 07:02:51 -- common/autotest_common.sh@154 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:33.386 07:02:51 -- 
common/autotest_common.sh@156 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:33.386 07:02:51 -- common/autotest_common.sh@158 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:33.386 07:02:51 -- common/autotest_common.sh@160 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:33.386 07:02:51 -- common/autotest_common.sh@163 -- # : 00:07:33.386 07:02:51 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:33.386 07:02:51 -- common/autotest_common.sh@165 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:33.386 07:02:51 -- common/autotest_common.sh@167 -- # : 0 00:07:33.386 07:02:51 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:33.386 07:02:51 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:33.386 07:02:51 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:33.386 07:02:51 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:33.386 07:02:51 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:33.386 07:02:51 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:33.386 07:02:51 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:33.386 07:02:51 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:33.386 07:02:51 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:33.386 07:02:51 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:33.386 07:02:51 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:33.386 07:02:51 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:33.386 07:02:51 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:33.386 07:02:51 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:33.386 07:02:51 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:33.386 07:02:51 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:33.386 07:02:51 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:33.386 07:02:51 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:33.386 07:02:51 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:33.386 07:02:51 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:33.386 07:02:51 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:33.386 07:02:51 -- common/autotest_common.sh@196 -- # cat 00:07:33.386 07:02:51 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:33.386 07:02:51 -- common/autotest_common.sh@224 -- # export 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:33.387 07:02:51 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:33.387 07:02:51 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:33.387 07:02:51 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:33.387 07:02:51 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:33.387 07:02:51 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:33.387 07:02:51 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:33.387 07:02:51 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:33.387 07:02:51 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:33.387 07:02:51 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:33.387 07:02:51 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:33.387 07:02:51 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:33.387 07:02:51 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:33.387 07:02:51 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:33.387 07:02:51 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:33.387 07:02:51 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:33.387 07:02:51 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:33.387 07:02:51 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:33.387 07:02:51 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:33.387 07:02:51 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:33.387 07:02:51 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:33.387 07:02:51 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:33.387 07:02:51 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:33.387 07:02:51 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:33.387 07:02:51 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:33.387 07:02:51 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:33.387 07:02:51 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:33.387 07:02:51 -- common/autotest_common.sh@259 -- # valgrind= 00:07:33.387 07:02:51 -- common/autotest_common.sh@265 -- # uname -s 00:07:33.387 07:02:51 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:33.387 07:02:51 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:33.387 07:02:51 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:33.387 07:02:51 -- 
common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:33.387 07:02:51 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@275 -- # MAKE=make 00:07:33.387 07:02:51 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:33.387 07:02:51 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:33.387 07:02:51 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:33.387 07:02:51 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:33.387 07:02:51 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:33.387 07:02:51 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:33.387 07:02:51 -- common/autotest_common.sh@319 -- # [[ -z 489059 ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@319 -- # kill -0 489059 00:07:33.387 07:02:51 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:33.387 07:02:51 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:33.387 07:02:51 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:33.387 07:02:51 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:33.387 07:02:51 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:33.387 07:02:51 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:33.387 07:02:51 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:33.387 07:02:51 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.HapaZA 00:07:33.387 07:02:51 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:33.387 07:02:51 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.HapaZA/tests/nvmf /tmp/spdk.HapaZA 00:07:33.387 07:02:51 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:33.387 07:02:51 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:33.387 07:02:51 -- common/autotest_common.sh@328 -- # df -T 00:07:33.387 07:02:51 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:33.387 07:02:51 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:33.387 07:02:51 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # avails["$mount"]=785162240 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:33.387 07:02:51 -- common/autotest_common.sh@364 -- # uses["$mount"]=4499267584 00:07:33.387 07:02:51 -- 
common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # avails["$mount"]=53206769664 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:07:33.387 07:02:51 -- common/autotest_common.sh@364 -- # uses["$mount"]=8523837440 00:07:33.387 07:02:51 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864044032 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:07:33.387 07:02:51 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:07:33.387 07:02:51 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340121600 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:07:33.387 07:02:51 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:07:33.387 07:02:51 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864986112 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:07:33.387 07:02:51 -- common/autotest_common.sh@364 -- # uses["$mount"]=319488 00:07:33.387 07:02:51 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:33.387 07:02:51 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:33.387 07:02:51 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:33.387 07:02:51 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:33.387 07:02:51 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:33.387 * Looking for test storage... 
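The storage probe traced here works by snapshotting df -T into parallel arrays (mounts, fss, sizes, avails, uses, keyed by mount point) and then, for each candidate directory, comparing that mount's available space against the requested size; in this run the request is 2147483648 bytes plus a 64 MiB margin (2214592512 in total), and the overlay root at / qualifies just below with target_space=53206769664. A minimal bash sketch of that check follows, not SPDK's actual helper: the function name and the --block-size=1 flag are assumptions, and the real set_test_storage also walks several fallback directories and special-cases tmpfs/ramfs mounts.

    set_test_storage_sketch() {
      local requested_size=$1 target_dir=$2
      local source fs size use avail _ mount
      declare -A avails
      # Snapshot free space by mount point, mirroring the traced read loop.
      while read -r source fs size use avail _ mount; do
        avails["$mount"]=$avail
      done < <(df -T --block-size=1 | grep -v Filesystem)
      # Resolve the mount point backing the candidate directory (the awk
      # expression is verbatim from the trace below).
      mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
      if (( ${avails[$mount]:-0} >= requested_size )); then
        printf '* Found test storage at %s\n' "$target_dir"
        return 0
      fi
      return 1
    }

Called as set_test_storage_sketch 2214592512 /tmp/spdk.HapaZA/tests/nvmf, this would succeed on the filesystem shown above for the same reason the real run does.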
00:07:33.387 07:02:51 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:33.387 07:02:51 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:33.387 07:02:51 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.387 07:02:51 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:33.387 07:02:51 -- common/autotest_common.sh@373 -- # mount=/ 00:07:33.387 07:02:51 -- common/autotest_common.sh@375 -- # target_space=53206769664 00:07:33.387 07:02:51 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:33.387 07:02:51 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:33.387 07:02:51 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@382 -- # new_size=10738429952 00:07:33.387 07:02:51 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:33.387 07:02:51 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.387 07:02:51 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.387 07:02:51 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.387 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:33.387 07:02:51 -- common/autotest_common.sh@390 -- # return 0 00:07:33.387 07:02:51 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:33.387 07:02:51 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:07:33.387 07:02:51 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:33.387 07:02:51 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:33.387 07:02:51 -- common/autotest_common.sh@1682 -- # true 00:07:33.387 07:02:51 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:33.387 07:02:51 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:33.387 07:02:51 -- common/autotest_common.sh@27 -- # exec 00:07:33.387 07:02:51 -- common/autotest_common.sh@29 -- # exec 00:07:33.387 07:02:51 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:33.388 07:02:51 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:33.388 07:02:51 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:33.388 07:02:51 -- common/autotest_common.sh@18 -- # set -x 00:07:33.388 07:02:51 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:33.388 07:02:51 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:33.388 07:02:51 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:33.388 07:02:51 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:33.388 07:02:51 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:33.388 07:02:51 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:33.388 07:02:51 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:33.388 07:02:51 -- scripts/common.sh@335 -- # IFS=.-: 00:07:33.388 07:02:51 -- scripts/common.sh@335 -- # read -ra ver1 00:07:33.388 07:02:51 -- scripts/common.sh@336 -- # IFS=.-: 00:07:33.388 07:02:51 -- scripts/common.sh@336 -- # read -ra ver2 00:07:33.388 07:02:51 -- scripts/common.sh@337 -- # local 'op=<' 00:07:33.388 07:02:51 -- scripts/common.sh@339 -- # ver1_l=2 00:07:33.388 07:02:51 -- scripts/common.sh@340 -- # ver2_l=1 00:07:33.388 07:02:51 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:33.388 07:02:51 -- scripts/common.sh@343 -- # case "$op" in 00:07:33.388 07:02:51 -- scripts/common.sh@344 -- # : 1 00:07:33.388 07:02:51 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:33.388 07:02:51 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:33.388 07:02:51 -- scripts/common.sh@364 -- # decimal 1 00:07:33.388 07:02:51 -- scripts/common.sh@352 -- # local d=1 00:07:33.388 07:02:51 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:33.388 07:02:51 -- scripts/common.sh@354 -- # echo 1 00:07:33.388 07:02:51 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:33.388 07:02:51 -- scripts/common.sh@365 -- # decimal 2 00:07:33.388 07:02:51 -- scripts/common.sh@352 -- # local d=2 00:07:33.388 07:02:51 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:33.388 07:02:51 -- scripts/common.sh@354 -- # echo 2 00:07:33.388 07:02:51 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:33.388 07:02:51 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:33.388 07:02:51 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:33.388 07:02:51 -- scripts/common.sh@367 -- # return 0 00:07:33.388 07:02:51 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:33.388 07:02:51 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:33.388 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.388 --rc genhtml_branch_coverage=1 00:07:33.388 --rc genhtml_function_coverage=1 00:07:33.388 --rc genhtml_legend=1 00:07:33.388 --rc geninfo_all_blocks=1 00:07:33.388 --rc geninfo_unexecuted_blocks=1 00:07:33.388 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.388 ' 00:07:33.388 07:02:51 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:33.388 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.388 --rc genhtml_branch_coverage=1 00:07:33.388 --rc genhtml_function_coverage=1 00:07:33.388 --rc genhtml_legend=1 00:07:33.388 --rc geninfo_all_blocks=1 00:07:33.388 --rc geninfo_unexecuted_blocks=1 00:07:33.388 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.388 ' 00:07:33.388 07:02:51 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:33.388 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:07:33.388 --rc genhtml_branch_coverage=1 00:07:33.388 --rc genhtml_function_coverage=1 00:07:33.388 --rc genhtml_legend=1 00:07:33.388 --rc geninfo_all_blocks=1 00:07:33.388 --rc geninfo_unexecuted_blocks=1 00:07:33.388 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.388 ' 00:07:33.388 07:02:51 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:33.388 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:33.388 --rc genhtml_branch_coverage=1 00:07:33.388 --rc genhtml_function_coverage=1 00:07:33.388 --rc genhtml_legend=1 00:07:33.388 --rc geninfo_all_blocks=1 00:07:33.388 --rc geninfo_unexecuted_blocks=1 00:07:33.388 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:33.388 ' 00:07:33.388 07:02:51 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:33.388 07:02:51 -- ../common.sh@8 -- # pids=() 00:07:33.388 07:02:51 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:33.388 07:02:51 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:33.388 07:02:51 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:33.388 07:02:51 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:33.388 07:02:51 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:33.388 07:02:51 -- nvmf/run.sh@61 -- # mem_size=512 00:07:33.388 07:02:51 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:33.388 07:02:51 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:33.388 07:02:51 -- ../common.sh@69 -- # local fuzz_num=25 00:07:33.388 07:02:51 -- ../common.sh@70 -- # local time=1 00:07:33.388 07:02:51 -- ../common.sh@72 -- # (( i = 0 )) 00:07:33.388 07:02:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:33.388 07:02:51 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:33.388 07:02:51 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:33.388 07:02:51 -- nvmf/run.sh@24 -- # local timen=1 00:07:33.388 07:02:51 -- nvmf/run.sh@25 -- # local core=0x1 00:07:33.388 07:02:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:33.388 07:02:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:33.388 07:02:51 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:33.388 07:02:51 -- nvmf/run.sh@29 -- # port=4400 00:07:33.388 07:02:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:33.388 07:02:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:33.388 07:02:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:33.651 07:02:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:33.651 [2024-12-13 07:02:51.653566] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 
initialization... 00:07:33.651 [2024-12-13 07:02:51.653657] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid489211 ] 00:07:33.651 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.651 [2024-12-13 07:02:51.837162] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.651 [2024-12-13 07:02:51.857678] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:33.651 [2024-12-13 07:02:51.857800] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.955 [2024-12-13 07:02:51.909473] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:33.956 [2024-12-13 07:02:51.925795] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:33.956 INFO: Running with entropic power schedule (0xFF, 100). 00:07:33.956 INFO: Seed: 609686649 00:07:33.956 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:33.956 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:33.956 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:33.956 INFO: A corpus is not provided, starting from an empty corpus 00:07:33.956 #2 INITED exec/s: 0 rss: 59Mb 00:07:33.956 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:33.956 This may also happen if the target rejected all inputs we tried so far 00:07:33.956 [2024-12-13 07:02:51.990969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:33.956 [2024-12-13 07:02:51.990997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.219 NEW_FUNC[1/670]: 0x451418 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:34.219 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:34.219 #30 NEW cov: 11559 ft: 11561 corp: 2/113b lim: 320 exec/s: 0 rss: 67Mb L: 112/112 MS: 3 ChangeBit-InsertRepeatedBytes-InsertRepeatedBytes- 00:07:34.219 [2024-12-13 07:02:52.311909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:34.219 [2024-12-13 07:02:52.311963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.219 NEW_FUNC[1/1]: 0xead8b8 in spdk_get_ticks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:296 00:07:34.219 #31 NEW cov: 11675 ft: 12029 corp: 3/225b lim: 320 exec/s: 0 rss: 67Mb L: 112/112 MS: 1 ChangeByte- 00:07:34.219 [2024-12-13 07:02:52.361834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0b) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:34.219 [2024-12-13 07:02:52.361858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:34.219 #33 NEW cov: 11681 ft: 12445 corp: 4/309b lim: 320 exec/s: 0 rss: 67Mb L: 84/112 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:34.219 [2024-12-13 07:02:52.401916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0b) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:34.219 [2024-12-13 07:02:52.401944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.219 #34 NEW cov: 11766 ft: 12731 corp: 5/393b lim: 320 exec/s: 0 rss: 67Mb L: 84/112 MS: 1 ChangeBinInt- 00:07:34.219 [2024-12-13 07:02:52.442053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:34.219 [2024-12-13 07:02:52.442077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.499 #35 NEW cov: 11766 ft: 12837 corp: 6/506b lim: 320 exec/s: 0 rss: 67Mb L: 113/113 MS: 1 InsertByte- 00:07:34.499 [2024-12-13 07:02:52.482150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:34.499 [2024-12-13 07:02:52.482174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.499 #46 NEW cov: 11766 ft: 13066 corp: 7/619b lim: 320 exec/s: 0 rss: 67Mb L: 113/113 MS: 1 ShuffleBytes- 00:07:34.499 [2024-12-13 07:02:52.522281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0b) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:34.499 [2024-12-13 07:02:52.522306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.499 #47 NEW cov: 11766 ft: 13143 corp: 8/719b lim: 320 exec/s: 0 rss: 68Mb L: 100/113 MS: 1 CopyPart- 00:07:34.499 [2024-12-13 07:02:52.562405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:34.499 [2024-12-13 07:02:52.562429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.499 #53 NEW cov: 11766 ft: 13231 corp: 9/830b lim: 320 exec/s: 0 rss: 68Mb L: 111/113 MS: 1 EraseBytes- 00:07:34.499 [2024-12-13 07:02:52.602521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:34.499 [2024-12-13 07:02:52.602545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.499 #54 NEW cov: 11766 ft: 13301 corp: 10/942b lim: 320 exec/s: 0 rss: 68Mb L: 112/113 MS: 1 CMP- DE: "\377\377\377\377\377\377\000\000"- 00:07:34.499 [2024-12-13 07:02:52.632810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:1b1b1b0b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:34.499 [2024-12-13 07:02:52.632834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.499 [2024-12-13 07:02:52.632893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1b) qid:0 cid:5 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:34.499 [2024-12-13 07:02:52.632906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.499 [2024-12-13 07:02:52.632963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e4) qid:0 cid:6 nsid:55555555 cdw10:55555555 cdw11:55555555 00:07:34.499 [2024-12-13 07:02:52.632977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.499 #55 NEW cov: 11786 ft: 13662 corp: 11/1154b lim: 320 exec/s: 0 rss: 68Mb L: 212/212 MS: 1 CrossOver- 00:07:34.499 [2024-12-13 07:02:52.672972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:1b1b1b0b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:34.499 [2024-12-13 07:02:52.672996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.499 [2024-12-13 07:02:52.673060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1b) qid:0 cid:5 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:34.499 [2024-12-13 07:02:52.673074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.499 [2024-12-13 07:02:52.673131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e4) qid:0 cid:6 nsid:55555555 cdw10:55555555 cdw11:55555555 00:07:34.499 [2024-12-13 07:02:52.673144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.499 #56 NEW cov: 11786 ft: 13749 corp: 12/1366b lim: 320 exec/s: 0 rss: 68Mb L: 212/212 MS: 1 ChangeBinInt- 00:07:34.499 [2024-12-13 07:02:52.712862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:34.499 [2024-12-13 07:02:52.712885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.816 #57 NEW cov: 11786 ft: 13839 corp: 13/1478b lim: 320 exec/s: 0 rss: 68Mb L: 112/212 MS: 1 ChangeByte- 00:07:34.816 [2024-12-13 07:02:52.752959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0b) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b1b8b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:34.816 [2024-12-13 07:02:52.752984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.816 #58 NEW cov: 11786 ft: 13866 corp: 14/1563b lim: 320 exec/s: 0 rss: 68Mb L: 85/212 MS: 1 InsertByte- 00:07:34.816 [2024-12-13 07:02:52.793234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:34.816 [2024-12-13 07:02:52.793258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:34.816 [2024-12-13 07:02:52.793318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:5 nsid:55555555 cdw10:555555ff cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.816 [2024-12-13 07:02:52.793332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.816 NEW_FUNC[1/1]: 0x16dd468 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:34.816 #59 NEW cov: 11799 ft: 14325 corp: 15/1707b lim: 320 exec/s: 0 rss: 68Mb L: 144/212 MS: 1 CopyPart- 00:07:34.816 [2024-12-13 07:02:52.833520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:1b1b1b0b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:34.816 [2024-12-13 07:02:52.833544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.816 [2024-12-13 07:02:52.833603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1b) qid:0 cid:5 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:34.816 [2024-12-13 07:02:52.833617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.816 [2024-12-13 07:02:52.833679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e4) qid:0 cid:6 nsid:55555555 cdw10:55555555 cdw11:55555555 00:07:34.816 [2024-12-13 07:02:52.833700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.816 [2024-12-13 07:02:52.833774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:7 nsid:55555555 cdw10:ffffffff cdw11:5555e4ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.816 [2024-12-13 07:02:52.833791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:34.816 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:34.816 #60 NEW cov: 11822 ft: 14573 corp: 16/1979b lim: 320 exec/s: 0 rss: 68Mb L: 272/272 MS: 1 CopyPart- 00:07:34.816 [2024-12-13 07:02:52.883502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:34.816 [2024-12-13 07:02:52.883527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.816 [2024-12-13 07:02:52.883599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:5 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.816 [2024-12-13 07:02:52.883613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.816 #61 NEW cov: 11822 ft: 14581 corp: 17/2154b lim: 320 exec/s: 0 rss: 68Mb L: 175/272 MS: 1 CopyPart- 00:07:34.816 [2024-12-13 07:02:52.923473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:5555ffff cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:34.816 [2024-12-13 
07:02:52.923498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.816 #62 NEW cov: 11822 ft: 14588 corp: 18/2265b lim: 320 exec/s: 0 rss: 68Mb L: 111/272 MS: 1 CopyPart- 00:07:34.816 [2024-12-13 07:02:52.963713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:34.816 [2024-12-13 07:02:52.963739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.816 [2024-12-13 07:02:52.963796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:5 nsid:55555555 cdw10:555555ff cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.816 [2024-12-13 07:02:52.963809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.816 #63 NEW cov: 11822 ft: 14664 corp: 19/2409b lim: 320 exec/s: 63 rss: 68Mb L: 144/272 MS: 1 ShuffleBytes- 00:07:34.816 [2024-12-13 07:02:53.003726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:34.816 [2024-12-13 07:02:53.003752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.816 #64 NEW cov: 11822 ft: 14687 corp: 20/2522b lim: 320 exec/s: 64 rss: 68Mb L: 113/272 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\000\000"- 00:07:35.105 [2024-12-13 07:02:53.043851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:7e7e7e7e SGL TRANSPORT DATA BLOCK TRANSPORT 0x7e7e7e7e7e7e7e7e 00:07:35.105 [2024-12-13 07:02:53.043876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.105 #65 NEW cov: 11839 ft: 14750 corp: 21/2629b lim: 320 exec/s: 65 rss: 68Mb L: 107/272 MS: 1 InsertRepeatedBytes- 00:07:35.105 [2024-12-13 07:02:53.083925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:5555ffff cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.105 [2024-12-13 07:02:53.083952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.105 #66 NEW cov: 11839 ft: 14814 corp: 22/2740b lim: 320 exec/s: 66 rss: 68Mb L: 111/272 MS: 1 ChangeByte- 00:07:35.105 [2024-12-13 07:02:53.124042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0b) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:35.105 [2024-12-13 07:02:53.124069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.105 #67 NEW cov: 11839 ft: 14842 corp: 23/2840b lim: 320 exec/s: 67 rss: 68Mb L: 100/272 MS: 1 ChangeByte- 00:07:35.105 [2024-12-13 07:02:53.164148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffdf cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.105 [2024-12-13 07:02:53.164173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.105 #68 NEW cov: 11839 ft: 14861 corp: 24/2952b lim: 320 exec/s: 68 rss: 69Mb L: 112/272 MS: 1 ChangeBit- 00:07:35.105 [2024-12-13 07:02:53.204422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.105 [2024-12-13 07:02:53.204447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.105 [2024-12-13 07:02:53.204504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:5 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.105 [2024-12-13 07:02:53.204518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.105 #69 NEW cov: 11839 ft: 14863 corp: 25/3127b lim: 320 exec/s: 69 rss: 69Mb L: 175/272 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\000\000"- 00:07:35.105 [2024-12-13 07:02:53.244378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0b) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:35.105 [2024-12-13 07:02:53.244403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.105 #70 NEW cov: 11839 ft: 14885 corp: 26/3227b lim: 320 exec/s: 70 rss: 69Mb L: 100/272 MS: 1 ChangeBinInt- 00:07:35.105 [2024-12-13 07:02:53.284678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:35.105 [2024-12-13 07:02:53.284703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.105 [2024-12-13 07:02:53.284761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (7e) qid:0 cid:5 nsid:7e7e7e7e cdw10:555555ff cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffff55555555 00:07:35.105 [2024-12-13 07:02:53.284775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.105 #71 NEW cov: 11839 ft: 14907 corp: 27/3371b lim: 320 exec/s: 71 rss: 69Mb L: 144/272 MS: 1 CrossOver- 00:07:35.105 [2024-12-13 07:02:53.325025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:1b1b1b0b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.105 [2024-12-13 07:02:53.325050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.105 [2024-12-13 07:02:53.325106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1b) qid:0 cid:5 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:35.105 [2024-12-13 07:02:53.325120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.106 [2024-12-13 07:02:53.325173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e4) qid:0 cid:6 nsid:55555555 cdw10:55555555 cdw11:55555555 00:07:35.106 [2024-12-13 07:02:53.325191] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.106 [2024-12-13 07:02:53.325265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:7 nsid:55555555 cdw10:ffffffff cdw11:5555e4ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.106 [2024-12-13 07:02:53.325281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:35.379 #72 NEW cov: 11839 ft: 14961 corp: 28/3643b lim: 320 exec/s: 72 rss: 69Mb L: 272/272 MS: 1 ChangeByte- 00:07:35.379 [2024-12-13 07:02:53.374936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:5555ffff cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.379 [2024-12-13 07:02:53.374961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.379 [2024-12-13 07:02:53.375017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:5 nsid:55555555 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.379 [2024-12-13 07:02:53.375030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.379 #73 NEW cov: 11839 ft: 14970 corp: 29/3797b lim: 320 exec/s: 73 rss: 69Mb L: 154/272 MS: 1 InsertRepeatedBytes- 00:07:35.379 [2024-12-13 07:02:53.415100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:1b1b1b0b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.379 [2024-12-13 07:02:53.415124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.379 [2024-12-13 07:02:53.415191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1b) qid:0 cid:5 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:35.379 [2024-12-13 07:02:53.415205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.379 [2024-12-13 07:02:53.415265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (e4) qid:0 cid:6 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:5555551b 00:07:35.379 [2024-12-13 07:02:53.415279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.379 #74 NEW cov: 11839 ft: 14991 corp: 30/4051b lim: 320 exec/s: 74 rss: 69Mb L: 254/272 MS: 1 CrossOver- 00:07:35.379 [2024-12-13 07:02:53.455160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:55ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.379 [2024-12-13 07:02:53.455185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.379 [2024-12-13 07:02:53.455266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:5 nsid:55555555 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.379 [2024-12-13 07:02:53.455280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.379 #75 NEW cov: 11839 ft: 
14994 corp: 31/4192b lim: 320 exec/s: 75 rss: 69Mb L: 141/272 MS: 1 CopyPart- 00:07:35.379 [2024-12-13 07:02:53.495205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:1b1b1b0b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.379 [2024-12-13 07:02:53.495229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.379 [2024-12-13 07:02:53.495304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1b) qid:0 cid:5 nsid:ff1b1b1b cdw10:1b1b1b55 cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555555555ff 00:07:35.379 [2024-12-13 07:02:53.495318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.379 #76 NEW cov: 11839 ft: 15002 corp: 32/4352b lim: 320 exec/s: 76 rss: 69Mb L: 160/272 MS: 1 CrossOver- 00:07:35.379 [2024-12-13 07:02:53.535427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.379 [2024-12-13 07:02:53.535454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.379 [2024-12-13 07:02:53.535528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:5 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.379 [2024-12-13 07:02:53.535543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.380 #77 NEW cov: 11839 ft: 15030 corp: 33/4528b lim: 320 exec/s: 77 rss: 69Mb L: 176/272 MS: 1 InsertByte- 00:07:35.380 [2024-12-13 07:02:53.575413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:5555ffff cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.380 [2024-12-13 07:02:53.575438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.380 #78 NEW cov: 11839 ft: 15033 corp: 34/4639b lim: 320 exec/s: 78 rss: 69Mb L: 111/272 MS: 1 ChangeBit- 00:07:35.380 [2024-12-13 07:02:53.615688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:35.380 [2024-12-13 07:02:53.615713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.380 [2024-12-13 07:02:53.615771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:5 nsid:55555555 cdw10:58585858 cdw11:58585858 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.380 [2024-12-13 07:02:53.615785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.663 #79 NEW cov: 11839 ft: 15047 corp: 35/4806b lim: 320 exec/s: 79 rss: 69Mb L: 167/272 MS: 1 InsertRepeatedBytes- 00:07:35.663 [2024-12-13 07:02:53.655606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.663 [2024-12-13 07:02:53.655630] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.663 #80 NEW cov: 11839 ft: 15056 corp: 36/4918b lim: 320 exec/s: 80 rss: 69Mb L: 112/272 MS: 1 ChangeBit- 00:07:35.663 [2024-12-13 07:02:53.685834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:35.663 [2024-12-13 07:02:53.685858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.663 [2024-12-13 07:02:53.685914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:5 nsid:55555555 cdw10:58585858 cdw11:58585858 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.663 [2024-12-13 07:02:53.685928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.663 #81 NEW cov: 11839 ft: 15058 corp: 37/5085b lim: 320 exec/s: 81 rss: 69Mb L: 167/272 MS: 1 ChangeBit- 00:07:35.663 [2024-12-13 07:02:53.725803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0b) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:35.663 [2024-12-13 07:02:53.725826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.663 #82 NEW cov: 11839 ft: 15086 corp: 38/5169b lim: 320 exec/s: 82 rss: 69Mb L: 84/272 MS: 1 ShuffleBytes- 00:07:35.663 [2024-12-13 07:02:53.756130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:1b1b1b0b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.663 [2024-12-13 07:02:53.756155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.663 [2024-12-13 07:02:53.756236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1b) qid:0 cid:5 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:e4e41b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:35.663 [2024-12-13 07:02:53.756251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.663 [2024-12-13 07:02:53.756310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:6 nsid:55555555 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.663 [2024-12-13 07:02:53.756324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.663 #83 NEW cov: 11839 ft: 15108 corp: 39/5424b lim: 320 exec/s: 83 rss: 69Mb L: 255/272 MS: 1 EraseBytes- 00:07:35.663 [2024-12-13 07:02:53.796079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:1b1b1b0b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.663 [2024-12-13 07:02:53.796103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.663 [2024-12-13 07:02:53.796179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1b) qid:0 cid:5 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 
00:07:35.663 [2024-12-13 07:02:53.796198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.663 #84 NEW cov: 11839 ft: 15134 corp: 40/5584b lim: 320 exec/s: 84 rss: 69Mb L: 160/272 MS: 1 CrossOver- 00:07:35.663 [2024-12-13 07:02:53.836107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:35.663 [2024-12-13 07:02:53.836131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.663 #85 NEW cov: 11839 ft: 15196 corp: 41/5697b lim: 320 exec/s: 85 rss: 69Mb L: 113/272 MS: 1 CopyPart- 00:07:35.663 [2024-12-13 07:02:53.876308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:1b1b1b0b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x5555555555555555 00:07:35.663 [2024-12-13 07:02:53.876333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.663 [2024-12-13 07:02:53.876389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (1b) qid:0 cid:5 nsid:ff1b1b1b cdw10:1b1b1b55 cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555555555ff 00:07:35.663 [2024-12-13 07:02:53.876403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.942 #86 NEW cov: 11839 ft: 15223 corp: 42/5857b lim: 320 exec/s: 86 rss: 70Mb L: 160/272 MS: 1 ChangeByte- 00:07:35.942 [2024-12-13 07:02:53.916400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x55555555556f5555 00:07:35.942 [2024-12-13 07:02:53.916425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.942 #87 NEW cov: 11839 ft: 15226 corp: 43/5970b lim: 320 exec/s: 87 rss: 70Mb L: 113/272 MS: 1 ChangeBinInt- 00:07:35.942 [2024-12-13 07:02:53.956453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0b) qid:0 cid:4 nsid:1b1b1b1b cdw10:1b1b1b1b cdw11:1b1b1b1b SGL TRANSPORT DATA BLOCK TRANSPORT 0x1b1b1b1b1b1b1b1b 00:07:35.942 [2024-12-13 07:02:53.956477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.942 #88 NEW cov: 11839 ft: 15270 corp: 44/6054b lim: 320 exec/s: 44 rss: 70Mb L: 84/272 MS: 1 ChangeByte- 00:07:35.942 #88 DONE cov: 11839 ft: 15270 corp: 44/6054b lim: 320 exec/s: 44 rss: 70Mb 00:07:35.942 ###### Recommended dictionary. ###### 00:07:35.942 "\377\377\377\377\377\377\000\000" # Uses: 2 00:07:35.942 ###### End of recommended dictionary. 
###### 00:07:35.942 Done 88 runs in 2 second(s) 00:07:35.942 07:02:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:35.942 07:02:54 -- ../common.sh@72 -- # (( i++ )) 00:07:35.942 07:02:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:35.942 07:02:54 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:35.942 07:02:54 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:35.942 07:02:54 -- nvmf/run.sh@24 -- # local timen=1 00:07:35.942 07:02:54 -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.942 07:02:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.942 07:02:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:35.942 07:02:54 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:35.942 07:02:54 -- nvmf/run.sh@29 -- # port=4401 00:07:35.942 07:02:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.942 07:02:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:35.942 07:02:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.942 07:02:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:35.942 [2024-12-13 07:02:54.128439] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:35.942 [2024-12-13 07:02:54.128513] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid489668 ] 00:07:35.942 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.230 [2024-12-13 07:02:54.308969] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.230 [2024-12-13 07:02:54.328483] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:36.230 [2024-12-13 07:02:54.328618] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.230 [2024-12-13 07:02:54.379861] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:36.230 [2024-12-13 07:02:54.396207] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:36.230 INFO: Running with entropic power schedule (0xFF, 100). 00:07:36.230 INFO: Seed: 3079699392 00:07:36.230 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:36.231 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:36.231 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:36.231 INFO: A corpus is not provided, starting from an empty corpus 00:07:36.231 #2 INITED exec/s: 0 rss: 59Mb 00:07:36.231 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:36.231 This may also happen if the target rejected all inputs we tried so far 00:07:36.231 [2024-12-13 07:02:54.441312] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.231 [2024-12-13 07:02:54.441435] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.231 [2024-12-13 07:02:54.441542] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.231 [2024-12-13 07:02:54.441764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.231 [2024-12-13 07:02:54.441795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.231 [2024-12-13 07:02:54.441853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.231 [2024-12-13 07:02:54.441870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.231 [2024-12-13 07:02:54.441925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.231 [2024-12-13 07:02:54.441939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.799 NEW_FUNC[1/671]: 0x451d18 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:36.799 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:36.799 #7 NEW cov: 11619 ft: 11617 corp: 2/21b lim: 30 exec/s: 0 rss: 67Mb L: 20/20 MS: 5 CopyPart-ShuffleBytes-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:36.799 [2024-12-13 07:02:54.752132] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.799 [2024-12-13 07:02:54.752266] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.799 [2024-12-13 07:02:54.752381] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.799 [2024-12-13 07:02:54.752608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.799 [2024-12-13 07:02:54.752640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.799 [2024-12-13 07:02:54.752701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.799 [2024-12-13 07:02:54.752714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.799 [2024-12-13 07:02:54.752770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.799 [2024-12-13 07:02:54.752784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:36.799 #23 NEW cov: 11739 ft: 12054 corp: 3/41b lim: 30 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:36.799 [2024-12-13 07:02:54.802161] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.799 [2024-12-13 07:02:54.802288] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.799 [2024-12-13 07:02:54.802403] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.799 [2024-12-13 07:02:54.802635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.799 [2024-12-13 07:02:54.802661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.799 [2024-12-13 07:02:54.802722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.799 [2024-12-13 07:02:54.802736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.799 [2024-12-13 07:02:54.802794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.799 [2024-12-13 07:02:54.802808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.799 #24 NEW cov: 11745 ft: 12383 corp: 4/61b lim: 30 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:36.799 [2024-12-13 07:02:54.842250] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.799 [2024-12-13 07:02:54.842375] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.799 [2024-12-13 07:02:54.842493] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.799 [2024-12-13 07:02:54.842715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.799 [2024-12-13 07:02:54.842740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.799 [2024-12-13 07:02:54.842799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:878f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.799 [2024-12-13 07:02:54.842813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.799 [2024-12-13 07:02:54.842871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.799 [2024-12-13 07:02:54.842885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.800 #25 NEW cov: 11830 ft: 12630 corp: 5/81b lim: 30 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ChangeBit- 00:07:36.800 [2024-12-13 07:02:54.882347] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.800 [2024-12-13 07:02:54.882481] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: 
Invalid log page offset 0x300008f8f 00:07:36.800 [2024-12-13 07:02:54.882594] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.800 [2024-12-13 07:02:54.882820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.882846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:54.882905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.882919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:54.882978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.882992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.800 #26 NEW cov: 11830 ft: 12695 corp: 6/101b lim: 30 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ChangeByte- 00:07:36.800 [2024-12-13 07:02:54.922537] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000101 00:07:36.800 [2024-12-13 07:02:54.922676] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000101 00:07:36.800 [2024-12-13 07:02:54.922787] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.800 [2024-12-13 07:02:54.922899] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.800 [2024-12-13 07:02:54.923011] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f0a 00:07:36.800 [2024-12-13 07:02:54.923236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a8101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.923262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:54.923331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:01018101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.923346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:54.923401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0a8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.923418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:54.923475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.923489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:54.923546] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.923560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.800 #27 NEW cov: 11830 ft: 13317 corp: 7/131b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:36.800 [2024-12-13 07:02:54.972608] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.800 [2024-12-13 07:02:54.972731] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.800 [2024-12-13 07:02:54.972843] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.800 [2024-12-13 07:02:54.973064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.973090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:54.973144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8fff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.973158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:54.973230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:54.973245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.800 #28 NEW cov: 11830 ft: 13417 corp: 8/151b lim: 30 exec/s: 0 rss: 67Mb L: 20/30 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:36.800 [2024-12-13 07:02:55.012733] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008f8f 00:07:36.800 [2024-12-13 07:02:55.012863] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.800 [2024-12-13 07:02:55.012993] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:36.800 [2024-12-13 07:02:55.013239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a26020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:55.013272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:55.013331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:55.013346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.800 [2024-12-13 07:02:55.013401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.800 [2024-12-13 07:02:55.013416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:07:36.800 #29 NEW cov: 11830 ft: 13456 corp: 9/172b lim: 30 exec/s: 0 rss: 67Mb L: 21/30 MS: 1 InsertByte- 00:07:37.059 [2024-12-13 07:02:55.052918] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000101 00:07:37.059 [2024-12-13 07:02:55.053057] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000101 00:07:37.059 [2024-12-13 07:02:55.053171] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.059 [2024-12-13 07:02:55.053288] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.059 [2024-12-13 07:02:55.053398] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f0a 00:07:37.059 [2024-12-13 07:02:55.053611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a8101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.053637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.053699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:01018101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.053713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.053773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0a8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.053788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.053845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.053859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.053916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.053930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.059 #30 NEW cov: 11830 ft: 13502 corp: 10/202b lim: 30 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 CopyPart- 00:07:37.059 [2024-12-13 07:02:55.092952] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.059 [2024-12-13 07:02:55.093074] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.059 [2024-12-13 07:02:55.093193] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.059 [2024-12-13 07:02:55.093409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.093434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.093493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.093507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.093565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.093578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.059 #31 NEW cov: 11830 ft: 13588 corp: 11/222b lim: 30 exec/s: 0 rss: 67Mb L: 20/30 MS: 1 ChangeBit- 00:07:37.059 [2024-12-13 07:02:55.133032] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.059 [2024-12-13 07:02:55.133151] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.059 [2024-12-13 07:02:55.133372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.133400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.133454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.133468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.059 #32 NEW cov: 11830 ft: 13933 corp: 12/236b lim: 30 exec/s: 0 rss: 67Mb L: 14/30 MS: 1 EraseBytes- 00:07:37.059 [2024-12-13 07:02:55.173109] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300002f8f 00:07:37.059 [2024-12-13 07:02:55.173233] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.059 [2024-12-13 07:02:55.173448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.173474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.173535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.173549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.059 #33 NEW cov: 11830 ft: 14075 corp: 13/251b lim: 30 exec/s: 0 rss: 67Mb L: 15/30 MS: 1 CrossOver- 00:07:37.059 [2024-12-13 07:02:55.213193] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:37.059 [2024-12-13 07:02:55.213546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.213571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.213631] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.213646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.059 #36 NEW cov: 11870 ft: 14146 corp: 14/264b lim: 30 exec/s: 0 rss: 67Mb L: 13/30 MS: 3 ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:37.059 [2024-12-13 07:02:55.253367] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.059 [2024-12-13 07:02:55.253490] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.059 [2024-12-13 07:02:55.253603] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.059 [2024-12-13 07:02:55.253837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.253863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.253922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.253937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.059 [2024-12-13 07:02:55.253993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.059 [2024-12-13 07:02:55.254006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.059 #37 NEW cov: 11870 ft: 14184 corp: 15/284b lim: 30 exec/s: 0 rss: 68Mb L: 20/30 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:37.059 [2024-12-13 07:02:55.293543] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000101 00:07:37.059 [2024-12-13 07:02:55.293665] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ffff 00:07:37.059 [2024-12-13 07:02:55.293795] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.059 [2024-12-13 07:02:55.293909] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.059 [2024-12-13 07:02:55.294023] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f0a 00:07:37.060 [2024-12-13 07:02:55.294253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a8101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.060 [2024-12-13 07:02:55.294279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.060 [2024-12-13 07:02:55.294339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:01018101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.060 [2024-12-13 07:02:55.294353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.060 [2024-12-13 07:02:55.294411] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.060 [2024-12-13 07:02:55.294426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.060 [2024-12-13 07:02:55.294483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.060 [2024-12-13 07:02:55.294497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.060 [2024-12-13 07:02:55.294556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.060 [2024-12-13 07:02:55.294570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.318 #38 NEW cov: 11870 ft: 14223 corp: 16/314b lim: 30 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:37.318 [2024-12-13 07:02:55.333508] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:37.318 [2024-12-13 07:02:55.333756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.333782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.318 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:37.318 #39 NEW cov: 11893 ft: 14678 corp: 17/324b lim: 30 exec/s: 0 rss: 68Mb L: 10/30 MS: 1 EraseBytes- 00:07:37.318 [2024-12-13 07:02:55.383726] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:37.318 [2024-12-13 07:02:55.383864] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.318 [2024-12-13 07:02:55.383978] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261124) > buf size (4096) 00:07:37.318 [2024-12-13 07:02:55.384194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.384220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.318 [2024-12-13 07:02:55.384276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.384294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.318 [2024-12-13 07:02:55.384348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.384361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.318 #40 NEW cov: 11893 ft: 14773 corp: 18/345b lim: 30 exec/s: 0 rss: 68Mb L: 21/30 MS: 1 
PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:37.318 [2024-12-13 07:02:55.423849] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.318 [2024-12-13 07:02:55.423988] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.318 [2024-12-13 07:02:55.424101] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.318 [2024-12-13 07:02:55.424356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.424382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.318 [2024-12-13 07:02:55.424441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:878f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.424454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.318 [2024-12-13 07:02:55.424500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.424513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.318 #41 NEW cov: 11893 ft: 14824 corp: 19/365b lim: 30 exec/s: 41 rss: 68Mb L: 20/30 MS: 1 ChangeByte- 00:07:37.318 [2024-12-13 07:02:55.464034] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000101 00:07:37.318 [2024-12-13 07:02:55.464171] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000010a 00:07:37.318 [2024-12-13 07:02:55.464303] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.318 [2024-12-13 07:02:55.464413] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.318 [2024-12-13 07:02:55.464525] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f0a 00:07:37.318 [2024-12-13 07:02:55.464777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a8101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.464803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.318 [2024-12-13 07:02:55.464856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:01018101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.464870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.318 [2024-12-13 07:02:55.464926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0a8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.464940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.318 [2024-12-13 07:02:55.464994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8f8f838f 
cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.465007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.318 [2024-12-13 07:02:55.465061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.318 [2024-12-13 07:02:55.465079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.318 #42 NEW cov: 11893 ft: 14882 corp: 20/395b lim: 30 exec/s: 42 rss: 68Mb L: 30/30 MS: 1 CrossOver- 00:07:37.318 [2024-12-13 07:02:55.504014] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.319 [2024-12-13 07:02:55.504261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.319 [2024-12-13 07:02:55.504286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.319 #43 NEW cov: 11893 ft: 14943 corp: 21/406b lim: 30 exec/s: 43 rss: 68Mb L: 11/30 MS: 1 EraseBytes- 00:07:37.319 [2024-12-13 07:02:55.544128] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8b 00:07:37.319 [2024-12-13 07:02:55.544375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.319 [2024-12-13 07:02:55.544407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.578 #44 NEW cov: 11893 ft: 14971 corp: 22/417b lim: 30 exec/s: 44 rss: 68Mb L: 11/30 MS: 1 ChangeBit- 00:07:37.578 [2024-12-13 07:02:55.584366] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.578 [2024-12-13 07:02:55.584499] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.578 [2024-12-13 07:02:55.584612] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200006262 00:07:37.578 [2024-12-13 07:02:55.584724] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.578 [2024-12-13 07:02:55.584963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.584989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 07:02:55.585050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:878f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.585064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 07:02:55.585124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:62620262 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.585138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 
07:02:55.585196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:6262838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.585209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.578 #50 NEW cov: 11893 ft: 14992 corp: 23/445b lim: 30 exec/s: 50 rss: 68Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:07:37.578 [2024-12-13 07:02:55.624504] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000101 00:07:37.578 [2024-12-13 07:02:55.624643] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ffff 00:07:37.578 [2024-12-13 07:02:55.624755] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.578 [2024-12-13 07:02:55.624863] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.578 [2024-12-13 07:02:55.624974] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f0a 00:07:37.578 [2024-12-13 07:02:55.625196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0b8101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.625225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 07:02:55.625285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:01018101 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.625310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 07:02:55.625367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.625380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 07:02:55.625438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.625451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 07:02:55.625509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:ffff838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.625523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:37.578 #51 NEW cov: 11893 ft: 15007 corp: 24/475b lim: 30 exec/s: 51 rss: 68Mb L: 30/30 MS: 1 ChangeBit- 00:07:37.578 [2024-12-13 07:02:55.664540] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.578 [2024-12-13 07:02:55.664681] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.578 [2024-12-13 07:02:55.664896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 
07:02:55.664921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 07:02:55.664982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.664996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.578 #52 NEW cov: 11893 ft: 15043 corp: 25/492b lim: 30 exec/s: 52 rss: 68Mb L: 17/30 MS: 1 EraseBytes- 00:07:37.578 [2024-12-13 07:02:55.704674] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.578 [2024-12-13 07:02:55.704798] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.578 [2024-12-13 07:02:55.704915] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.578 [2024-12-13 07:02:55.705132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.705158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 07:02:55.705196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.578 [2024-12-13 07:02:55.705207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.578 [2024-12-13 07:02:55.705229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.579 [2024-12-13 07:02:55.705243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.579 #53 NEW cov: 11893 ft: 15100 corp: 26/514b lim: 30 exec/s: 53 rss: 68Mb L: 22/30 MS: 1 CopyPart- 00:07:37.579 [2024-12-13 07:02:55.744825] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.579 [2024-12-13 07:02:55.744946] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.579 [2024-12-13 07:02:55.745059] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000002e7 00:07:37.579 [2024-12-13 07:02:55.745177] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009a8f 00:07:37.579 [2024-12-13 07:02:55.745399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.579 [2024-12-13 07:02:55.745425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.579 [2024-12-13 07:02:55.745485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.579 [2024-12-13 07:02:55.745500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.579 [2024-12-13 07:02:55.745560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f818f cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.579 [2024-12-13 07:02:55.745574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.579 [2024-12-13 07:02:55.745631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:996b830b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.579 [2024-12-13 07:02:55.745645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.579 #54 NEW cov: 11893 ft: 15111 corp: 27/542b lim: 30 exec/s: 54 rss: 68Mb L: 28/30 MS: 1 CMP- DE: "\001\002\347\231k\013\223\232"- 00:07:37.579 [2024-12-13 07:02:55.784822] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.579 [2024-12-13 07:02:55.785054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.579 [2024-12-13 07:02:55.785079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.579 #55 NEW cov: 11893 ft: 15120 corp: 28/553b lim: 30 exec/s: 55 rss: 68Mb L: 11/30 MS: 1 EraseBytes- 00:07:37.838 [2024-12-13 07:02:55.824996] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.838 [2024-12-13 07:02:55.825115] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007fff 00:07:37.838 [2024-12-13 07:02:55.825366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.825392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.838 [2024-12-13 07:02:55.825451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.825465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.838 #56 NEW cov: 11893 ft: 15142 corp: 29/567b lim: 30 exec/s: 56 rss: 68Mb L: 14/30 MS: 1 ChangeBit- 00:07:37.838 [2024-12-13 07:02:55.865113] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.838 [2024-12-13 07:02:55.865258] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.838 [2024-12-13 07:02:55.865502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a832f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.865530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.838 [2024-12-13 07:02:55.865588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.865602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.838 #57 NEW cov: 11893 ft: 15151 corp: 30/580b 
lim: 30 exec/s: 57 rss: 68Mb L: 13/30 MS: 1 EraseBytes- 00:07:37.838 [2024-12-13 07:02:55.905271] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008f8f 00:07:37.838 [2024-12-13 07:02:55.905396] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.838 [2024-12-13 07:02:55.905507] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f0a 00:07:37.838 [2024-12-13 07:02:55.905736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ab3020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.905762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.838 [2024-12-13 07:02:55.905820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.905834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.838 [2024-12-13 07:02:55.905894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.905907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.838 #58 NEW cov: 11893 ft: 15157 corp: 31/598b lim: 30 exec/s: 58 rss: 69Mb L: 18/30 MS: 1 InsertByte- 00:07:37.838 [2024-12-13 07:02:55.945417] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.838 [2024-12-13 07:02:55.945539] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.838 [2024-12-13 07:02:55.945654] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.838 [2024-12-13 07:02:55.945765] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f0a 00:07:37.838 [2024-12-13 07:02:55.945981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.946007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.838 [2024-12-13 07:02:55.946064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.946078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.838 [2024-12-13 07:02:55.946133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.946146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.838 [2024-12-13 07:02:55.946202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.838 [2024-12-13 07:02:55.946216] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.838 #59 NEW cov: 11893 ft: 15182 corp: 32/622b lim: 30 exec/s: 59 rss: 69Mb L: 24/30 MS: 1 CrossOver- 00:07:37.839 [2024-12-13 07:02:55.985503] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008f8f 00:07:37.839 [2024-12-13 07:02:55.985627] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000008f 00:07:37.839 [2024-12-13 07:02:55.985741] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.839 [2024-12-13 07:02:55.985961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a26020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.839 [2024-12-13 07:02:55.985986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.839 [2024-12-13 07:02:55.986046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f818f cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.839 [2024-12-13 07:02:55.986061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.839 [2024-12-13 07:02:55.986119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.839 [2024-12-13 07:02:55.986133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.839 #60 NEW cov: 11893 ft: 15194 corp: 33/643b lim: 30 exec/s: 60 rss: 69Mb L: 21/30 MS: 1 ChangeBinInt- 00:07:37.839 [2024-12-13 07:02:56.025612] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.839 [2024-12-13 07:02:56.025732] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.839 [2024-12-13 07:02:56.025847] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.839 [2024-12-13 07:02:56.026067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.839 [2024-12-13 07:02:56.026092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.839 [2024-12-13 07:02:56.026150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8fff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.839 [2024-12-13 07:02:56.026165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.839 [2024-12-13 07:02:56.026241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.839 [2024-12-13 07:02:56.026256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.839 #61 NEW cov: 11893 ft: 15206 corp: 34/663b lim: 30 exec/s: 61 rss: 69Mb L: 20/30 MS: 1 ShuffleBytes- 00:07:37.839 [2024-12-13 07:02:56.065758] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 
00:07:37.839 [2024-12-13 07:02:56.065879] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006b0b 00:07:37.839 [2024-12-13 07:02:56.065997] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:37.839 [2024-12-13 07:02:56.066220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.839 [2024-12-13 07:02:56.066246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.839 [2024-12-13 07:02:56.066303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:010281e7 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.839 [2024-12-13 07:02:56.066318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.839 [2024-12-13 07:02:56.066376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:939a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.839 [2024-12-13 07:02:56.066389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.098 #62 NEW cov: 11893 ft: 15210 corp: 35/683b lim: 30 exec/s: 62 rss: 69Mb L: 20/30 MS: 1 PersAutoDict- DE: "\001\002\347\231k\013\223\232"- 00:07:38.098 [2024-12-13 07:02:56.105871] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:38.098 [2024-12-13 07:02:56.106006] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:38.098 [2024-12-13 07:02:56.106121] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:38.098 [2024-12-13 07:02:56.106255] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f0a 00:07:38.098 [2024-12-13 07:02:56.106486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.098 [2024-12-13 07:02:56.106512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.098 [2024-12-13 07:02:56.106572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.106586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.106643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.106657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.106715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.106729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.099 #63 NEW cov: 11893 ft: 15275 corp: 36/707b 
lim: 30 exec/s: 63 rss: 69Mb L: 24/30 MS: 1 ShuffleBytes- 00:07:38.099 [2024-12-13 07:02:56.155930] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:38.099 [2024-12-13 07:02:56.156152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.156176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.099 #64 NEW cov: 11893 ft: 15283 corp: 37/717b lim: 30 exec/s: 64 rss: 69Mb L: 10/30 MS: 1 ShuffleBytes- 00:07:38.099 [2024-12-13 07:02:56.196156] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:38.099 [2024-12-13 07:02:56.196313] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:38.099 [2024-12-13 07:02:56.196429] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x188f 00:07:38.099 [2024-12-13 07:02:56.196543] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f0a 00:07:38.099 [2024-12-13 07:02:56.196763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.196788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.196846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.196860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.196914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.196930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.196986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.196999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.099 #65 NEW cov: 11893 ft: 15287 corp: 38/741b lim: 30 exec/s: 65 rss: 69Mb L: 24/30 MS: 1 ChangeBinInt- 00:07:38.099 [2024-12-13 07:02:56.236179] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200008f8f 00:07:38.099 [2024-12-13 07:02:56.236333] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:38.099 [2024-12-13 07:02:56.236448] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:38.099 [2024-12-13 07:02:56.236668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a26020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.236694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 
07:02:56.236755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.236768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.236827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.236841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.099 #66 NEW cov: 11893 ft: 15298 corp: 39/762b lim: 30 exec/s: 66 rss: 69Mb L: 21/30 MS: 1 ChangeBit- 00:07:38.099 [2024-12-13 07:02:56.276422] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:38.099 [2024-12-13 07:02:56.276541] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:38.099 [2024-12-13 07:02:56.276655] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x8f8f 00:07:38.099 [2024-12-13 07:02:56.276765] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:38.099 [2024-12-13 07:02:56.276982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.277007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.277066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:878f83fe cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.277079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.277136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.277149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.277207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.277220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.099 #67 NEW cov: 11893 ft: 15315 corp: 40/790b lim: 30 exec/s: 67 rss: 69Mb L: 28/30 MS: 1 CMP- DE: "\376\377\377\377\000\000\000\000"- 00:07:38.099 [2024-12-13 07:02:56.316528] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:38.099 [2024-12-13 07:02:56.316651] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:38.099 [2024-12-13 07:02:56.316761] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a5a5 00:07:38.099 [2024-12-13 07:02:56.316872] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000a58f 00:07:38.099 [2024-12-13 07:02:56.317086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.317112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.317170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8fff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.317184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.317244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.317258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.099 [2024-12-13 07:02:56.317316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:a5a581a5 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.099 [2024-12-13 07:02:56.317330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.359 #68 NEW cov: 11893 ft: 15362 corp: 41/818b lim: 30 exec/s: 68 rss: 69Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:07:38.359 [2024-12-13 07:02:56.356553] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (252888) > buf size (4096) 00:07:38.359 [2024-12-13 07:02:56.356673] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:38.359 [2024-12-13 07:02:56.356916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:f6f500d0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.359 [2024-12-13 07:02:56.356942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.359 [2024-12-13 07:02:56.357002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:706c838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.359 [2024-12-13 07:02:56.357016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.359 #69 NEW cov: 11893 ft: 15398 corp: 42/831b lim: 30 exec/s: 69 rss: 69Mb L: 13/30 MS: 1 ChangeBinInt- 00:07:38.359 [2024-12-13 07:02:56.396716] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a8f 00:07:38.359 [2024-12-13 07:02:56.396851] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:38.359 [2024-12-13 07:02:56.396960] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000a8f 00:07:38.359 [2024-12-13 07:02:56.397182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.359 [2024-12-13 07:02:56.397212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.359 [2024-12-13 07:02:56.397271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f8387 cdw11:00000003 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:38.359 [2024-12-13 07:02:56.397285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.359 [2024-12-13 07:02:56.397341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.359 [2024-12-13 07:02:56.397358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.359 #70 NEW cov: 11893 ft: 15408 corp: 43/849b lim: 30 exec/s: 70 rss: 69Mb L: 18/30 MS: 1 CrossOver- 00:07:38.359 [2024-12-13 07:02:56.436807] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000368f 00:07:38.359 [2024-12-13 07:02:56.436933] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000008f 00:07:38.359 [2024-12-13 07:02:56.437048] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300008f8f 00:07:38.359 [2024-12-13 07:02:56.437273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a26020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.359 [2024-12-13 07:02:56.437299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.359 [2024-12-13 07:02:56.437358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:8f8f818f cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.359 [2024-12-13 07:02:56.437372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.359 [2024-12-13 07:02:56.437428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:8f8f838f cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.359 [2024-12-13 07:02:56.437443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.359 #71 NEW cov: 11893 ft: 15416 corp: 44/870b lim: 30 exec/s: 35 rss: 69Mb L: 21/30 MS: 1 ChangeByte- 00:07:38.359 #71 DONE cov: 11893 ft: 15416 corp: 44/870b lim: 30 exec/s: 35 rss: 69Mb 00:07:38.359 ###### Recommended dictionary. ###### 00:07:38.359 "\377\377\377\377\377\377\377\377" # Uses: 4 00:07:38.359 "\001\002\347\231k\013\223\232" # Uses: 1 00:07:38.359 "\376\377\377\377\000\000\000\000" # Uses: 0 00:07:38.359 ###### End of recommended dictionary. 
###### 00:07:38.359 Done 71 runs in 2 second(s) 00:07:38.359 07:02:56 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:38.359 07:02:56 -- ../common.sh@72 -- # (( i++ )) 00:07:38.359 07:02:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:38.359 07:02:56 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:38.359 07:02:56 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:38.359 07:02:56 -- nvmf/run.sh@24 -- # local timen=1 00:07:38.359 07:02:56 -- nvmf/run.sh@25 -- # local core=0x1 00:07:38.359 07:02:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:38.359 07:02:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:38.359 07:02:56 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:38.359 07:02:56 -- nvmf/run.sh@29 -- # port=4402 00:07:38.359 07:02:56 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:38.359 07:02:56 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:38.359 07:02:56 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:38.359 07:02:56 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:38.618 [2024-12-13 07:02:56.621310] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:38.618 [2024-12-13 07:02:56.621388] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid490213 ] 00:07:38.618 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.618 [2024-12-13 07:02:56.797181] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.618 [2024-12-13 07:02:56.816568] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:38.618 [2024-12-13 07:02:56.816706] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.877 [2024-12-13 07:02:56.867972] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.877 [2024-12-13 07:02:56.884308] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:38.877 INFO: Running with entropic power schedule (0xFF, 100). 00:07:38.877 INFO: Seed: 1271709797 00:07:38.877 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:38.877 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:38.877 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:38.877 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.877 #2 INITED exec/s: 0 rss: 59Mb 00:07:38.877 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:38.877 This may also happen if the target rejected all inputs we tried so far 00:07:38.877 [2024-12-13 07:02:56.929292] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.877 [2024-12-13 07:02:56.929538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.877 [2024-12-13 07:02:56.929571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.136 NEW_FUNC[1/669]: 0x454738 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:39.136 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:39.136 #10 NEW cov: 11582 ft: 11584 corp: 2/11b lim: 35 exec/s: 0 rss: 67Mb L: 10/10 MS: 3 InsertByte-ShuffleBytes-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:39.136 [2024-12-13 07:02:57.240308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff01000a cdw11:a200e79a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.136 [2024-12-13 07:02:57.240349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.136 NEW_FUNC[1/1]: 0x1c76b18 in thread_update_stats /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:920 00:07:39.136 #11 NEW cov: 11706 ft: 12160 corp: 3/20b lim: 35 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 CMP- DE: "\377\001\347\232\242c\305z"- 00:07:39.136 [2024-12-13 07:02:57.280109] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.136 [2024-12-13 07:02:57.280463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.136 [2024-12-13 07:02:57.280493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.136 [2024-12-13 07:02:57.280551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:01e700ff cdw11:63009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.136 [2024-12-13 07:02:57.280567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.136 #12 NEW cov: 11712 ft: 12729 corp: 4/38b lim: 35 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 PersAutoDict- DE: "\377\001\347\232\242c\305z"- 00:07:39.136 [2024-12-13 07:02:57.320399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff01000a cdw11:a200e79a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.136 [2024-12-13 07:02:57.320424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.136 #13 NEW cov: 11797 ft: 12988 corp: 5/47b lim: 35 exec/s: 0 rss: 67Mb L: 9/18 MS: 1 ChangeByte- 00:07:39.136 [2024-12-13 07:02:57.360392] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.136 [2024-12-13 07:02:57.360746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:39.136 [2024-12-13 07:02:57.360778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.136 [2024-12-13 07:02:57.360838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:01e700ff cdw11:63009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.136 [2024-12-13 07:02:57.360852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.396 #14 NEW cov: 11797 ft: 13132 corp: 6/64b lim: 35 exec/s: 0 rss: 67Mb L: 17/18 MS: 1 EraseBytes- 00:07:39.396 [2024-12-13 07:02:57.400437] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.396 [2024-12-13 07:02:57.400664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.400693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.396 #15 NEW cov: 11797 ft: 13195 corp: 7/75b lim: 35 exec/s: 0 rss: 67Mb L: 11/18 MS: 1 CrossOver- 00:07:39.396 [2024-12-13 07:02:57.440607] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.396 [2024-12-13 07:02:57.440947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00020000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.440975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.396 [2024-12-13 07:02:57.441031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:01e700ff cdw11:63009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.441045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.396 #16 NEW cov: 11797 ft: 13315 corp: 8/93b lim: 35 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ChangeBit- 00:07:39.396 [2024-12-13 07:02:57.480873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.480897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.396 #18 NEW cov: 11797 ft: 13367 corp: 9/102b lim: 35 exec/s: 0 rss: 67Mb L: 9/18 MS: 2 ShuffleBytes-PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:39.396 [2024-12-13 07:02:57.520836] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.396 [2024-12-13 07:02:57.520972] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.396 [2024-12-13 07:02:57.521191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00020000 cdw11:9a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.521219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.396 [2024-12-13 07:02:57.521278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:e7000000 cdw11:630001a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.521293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.396 #19 NEW cov: 11797 ft: 13451 corp: 10/120b lim: 35 exec/s: 0 rss: 67Mb L: 18/18 MS: 1 ShuffleBytes- 00:07:39.396 [2024-12-13 07:02:57.561107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff01000a cdw11:9f00e79a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.561133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.396 #20 NEW cov: 11797 ft: 13492 corp: 11/130b lim: 35 exec/s: 0 rss: 67Mb L: 10/18 MS: 1 InsertByte- 00:07:39.396 [2024-12-13 07:02:57.591071] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.396 [2024-12-13 07:02:57.591198] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.396 [2024-12-13 07:02:57.591407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.591434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.396 [2024-12-13 07:02:57.591491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f7640000 cdw11:64006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.591507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.396 #21 NEW cov: 11797 ft: 13547 corp: 12/149b lim: 35 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:39.396 [2024-12-13 07:02:57.631200] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.396 [2024-12-13 07:02:57.631321] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.396 [2024-12-13 07:02:57.631542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.631569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.396 [2024-12-13 07:02:57.631627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f72d0000 cdw11:64006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.396 [2024-12-13 07:02:57.631642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.655 #22 NEW cov: 11797 ft: 13602 corp: 13/169b lim: 35 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 InsertByte- 00:07:39.655 [2024-12-13 07:02:57.671263] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.655 [2024-12-13 07:02:57.671509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00020000 cdw11:9a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.655 [2024-12-13 07:02:57.671536] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.655 #23 NEW cov: 11797 ft: 13624 corp: 14/178b lim: 35 exec/s: 0 rss: 68Mb L: 9/20 MS: 1 EraseBytes- 00:07:39.655 [2024-12-13 07:02:57.711951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0ae20024 cdw11:e200e2e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.655 [2024-12-13 07:02:57.711977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.655 [2024-12-13 07:02:57.712033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:e2e200e2 cdw11:e200e2e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.655 [2024-12-13 07:02:57.712047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.655 [2024-12-13 07:02:57.712103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:e2e200e2 cdw11:e200e2e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.655 [2024-12-13 07:02:57.712117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.655 [2024-12-13 07:02:57.712172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:e2e200e2 cdw11:e200e2e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.655 [2024-12-13 07:02:57.712190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.655 #25 NEW cov: 11797 ft: 14189 corp: 15/210b lim: 35 exec/s: 0 rss: 68Mb L: 32/32 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:39.655 [2024-12-13 07:02:57.751686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:01e7000a cdw11:21009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.655 [2024-12-13 07:02:57.751711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.656 #26 NEW cov: 11797 ft: 14214 corp: 16/218b lim: 35 exec/s: 0 rss: 68Mb L: 8/32 MS: 1 CrossOver- 00:07:39.656 [2024-12-13 07:02:57.781886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000000f4 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.656 [2024-12-13 07:02:57.781911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.656 [2024-12-13 07:02:57.781984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:01e700ff cdw11:63009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.656 [2024-12-13 07:02:57.781999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.656 #27 NEW cov: 11797 ft: 14282 corp: 17/236b lim: 35 exec/s: 0 rss: 68Mb L: 18/32 MS: 1 ChangeByte- 00:07:39.656 [2024-12-13 07:02:57.821683] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.656 [2024-12-13 07:02:57.821919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.656 [2024-12-13 07:02:57.821947] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.656 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:39.656 #28 NEW cov: 11820 ft: 14314 corp: 18/247b lim: 35 exec/s: 0 rss: 68Mb L: 11/32 MS: 1 ShuffleBytes- 00:07:39.656 [2024-12-13 07:02:57.862004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:01e7000a cdw11:21009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.656 [2024-12-13 07:02:57.862029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.656 #29 NEW cov: 11820 ft: 14355 corp: 19/255b lim: 35 exec/s: 0 rss: 68Mb L: 8/32 MS: 1 EraseBytes- 00:07:39.915 [2024-12-13 07:02:57.901951] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.915 [2024-12-13 07:02:57.902190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.915 [2024-12-13 07:02:57.902219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.915 #30 NEW cov: 11820 ft: 14407 corp: 20/266b lim: 35 exec/s: 30 rss: 68Mb L: 11/32 MS: 1 ShuffleBytes- 00:07:39.915 [2024-12-13 07:02:57.942210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:01e700ff cdw11:63009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.915 [2024-12-13 07:02:57.942235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.915 #31 NEW cov: 11820 ft: 14422 corp: 21/274b lim: 35 exec/s: 31 rss: 68Mb L: 8/32 MS: 1 PersAutoDict- DE: "\377\001\347\232\242c\305z"- 00:07:39.915 [2024-12-13 07:02:57.982164] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.915 [2024-12-13 07:02:57.982518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ff0000 cdw11:9a0001e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.915 [2024-12-13 07:02:57.982545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.915 [2024-12-13 07:02:57.982610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:c57a0063 cdw11:64006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.915 [2024-12-13 07:02:57.982624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.915 #32 NEW cov: 11820 ft: 14441 corp: 22/293b lim: 35 exec/s: 32 rss: 68Mb L: 19/32 MS: 1 PersAutoDict- DE: "\377\001\347\232\242c\305z"- 00:07:39.915 [2024-12-13 07:02:58.022323] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.915 [2024-12-13 07:02:58.022643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00ff0000 cdw11:9a0001e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.915 [2024-12-13 07:02:58.022669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.915 [2024-12-13 
07:02:58.022724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:c57a0063 cdw11:64006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.915 [2024-12-13 07:02:58.022738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.915 #33 NEW cov: 11820 ft: 14467 corp: 23/309b lim: 35 exec/s: 33 rss: 68Mb L: 16/32 MS: 1 EraseBytes- 00:07:39.915 [2024-12-13 07:02:58.062445] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.915 [2024-12-13 07:02:58.062565] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.915 [2024-12-13 07:02:58.062807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.915 [2024-12-13 07:02:58.062834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.915 [2024-12-13 07:02:58.062894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.915 [2024-12-13 07:02:58.062910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.915 #34 NEW cov: 11820 ft: 14505 corp: 24/328b lim: 35 exec/s: 34 rss: 68Mb L: 19/32 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:07:39.915 [2024-12-13 07:02:58.102704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff01000a cdw11:b300e79a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.915 [2024-12-13 07:02:58.102729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.916 #35 NEW cov: 11820 ft: 14513 corp: 25/339b lim: 35 exec/s: 35 rss: 68Mb L: 11/32 MS: 1 InsertByte- 00:07:39.916 [2024-12-13 07:02:58.142718] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.916 [2024-12-13 07:02:58.142836] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.916 [2024-12-13 07:02:58.142947] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.916 [2024-12-13 07:02:58.143264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.916 [2024-12-13 07:02:58.143291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.916 [2024-12-13 07:02:58.143350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f72d0000 cdw11:64006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.916 [2024-12-13 07:02:58.143366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.916 [2024-12-13 07:02:58.143424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.916 [2024-12-13 07:02:58.143442] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.916 [2024-12-13 07:02:58.143498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:6464002d cdw11:64006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.916 [2024-12-13 07:02:58.143512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.175 #36 NEW cov: 11820 ft: 14557 corp: 26/372b lim: 35 exec/s: 36 rss: 68Mb L: 33/33 MS: 1 CopyPart- 00:07:40.175 [2024-12-13 07:02:58.192955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:01e7000a cdw11:21009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.192979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.175 #37 NEW cov: 11820 ft: 14610 corp: 27/380b lim: 35 exec/s: 37 rss: 68Mb L: 8/33 MS: 1 ShuffleBytes- 00:07:40.175 [2024-12-13 07:02:58.233096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffe2000a cdw11:e200e2e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.233120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.175 #38 NEW cov: 11820 ft: 14615 corp: 28/393b lim: 35 exec/s: 38 rss: 68Mb L: 13/33 MS: 1 CrossOver- 00:07:40.175 [2024-12-13 07:02:58.263033] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.175 [2024-12-13 07:02:58.263358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.263384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.175 [2024-12-13 07:02:58.263446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:01e700ff cdw11:63009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.263460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.175 #39 NEW cov: 11820 ft: 14662 corp: 29/410b lim: 35 exec/s: 39 rss: 68Mb L: 17/33 MS: 1 CopyPart- 00:07:40.175 [2024-12-13 07:02:58.303143] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.175 [2024-12-13 07:02:58.303277] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.175 [2024-12-13 07:02:58.303491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.303519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.175 [2024-12-13 07:02:58.303579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f7640000 cdw11:64006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.303595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.175 #40 
NEW cov: 11820 ft: 14668 corp: 30/429b lim: 35 exec/s: 40 rss: 68Mb L: 19/33 MS: 1 ChangeBinInt- 00:07:40.175 #41 NEW cov: 11820 ft: 15050 corp: 31/436b lim: 35 exec/s: 41 rss: 68Mb L: 7/33 MS: 1 EraseBytes- 00:07:40.175 [2024-12-13 07:02:58.383436] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.175 [2024-12-13 07:02:58.383674] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.175 [2024-12-13 07:02:58.383990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:f7000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.384018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.175 [2024-12-13 07:02:58.384077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:64640064 cdw11:00006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.384091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.175 [2024-12-13 07:02:58.384147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:640000f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.384163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.175 [2024-12-13 07:02:58.384220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:64640064 cdw11:64006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.175 [2024-12-13 07:02:58.384234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.175 #42 NEW cov: 11820 ft: 15055 corp: 32/465b lim: 35 exec/s: 42 rss: 69Mb L: 29/33 MS: 1 CopyPart- 00:07:40.434 [2024-12-13 07:02:58.433942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffe2000a cdw11:e200e2e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.434 [2024-12-13 07:02:58.433966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.434 [2024-12-13 07:02:58.434026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff0100e7 cdw11:a200e79a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.434 [2024-12-13 07:02:58.434039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.434 [2024-12-13 07:02:58.434095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:7a9a00c5 cdw11:c500a263 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.435 [2024-12-13 07:02:58.434108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.435 #43 NEW cov: 11820 ft: 15242 corp: 33/486b lim: 35 exec/s: 43 rss: 69Mb L: 21/33 MS: 1 PersAutoDict- DE: "\377\001\347\232\242c\305z"- 00:07:40.435 [2024-12-13 07:02:58.473856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff0100f4 cdw11:a200e79a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.435 [2024-12-13 
07:02:58.473880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.435 #44 NEW cov: 11820 ft: 15255 corp: 34/498b lim: 35 exec/s: 44 rss: 69Mb L: 12/33 MS: 1 EraseBytes- 00:07:40.435 [2024-12-13 07:02:58.513931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:01e7000a cdw11:de00665d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.435 [2024-12-13 07:02:58.513955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.435 #45 NEW cov: 11820 ft: 15353 corp: 35/506b lim: 35 exec/s: 45 rss: 69Mb L: 8/33 MS: 1 ChangeBinInt- 00:07:40.435 [2024-12-13 07:02:58.553887] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.435 [2024-12-13 07:02:58.554025] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.435 [2024-12-13 07:02:58.554243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.435 [2024-12-13 07:02:58.554282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.435 [2024-12-13 07:02:58.554340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff010000 cdw11:a200e79a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.435 [2024-12-13 07:02:58.554358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.435 #46 NEW cov: 11820 ft: 15410 corp: 36/524b lim: 35 exec/s: 46 rss: 69Mb L: 18/33 MS: 1 CopyPart- 00:07:40.435 [2024-12-13 07:02:58.594297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffe2000a cdw11:e200e2e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.435 [2024-12-13 07:02:58.594322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.435 [2024-12-13 07:02:58.594378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:e79a0099 cdw11:c500a263 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.435 [2024-12-13 07:02:58.594392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.435 #47 NEW cov: 11820 ft: 15417 corp: 37/538b lim: 35 exec/s: 47 rss: 69Mb L: 14/33 MS: 1 InsertByte- 00:07:40.435 [2024-12-13 07:02:58.634122] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.435 [2024-12-13 07:02:58.634272] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.435 [2024-12-13 07:02:58.634491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:98000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.435 [2024-12-13 07:02:58.634517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.435 [2024-12-13 07:02:58.634576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:07:40.435 [2024-12-13 07:02:58.634593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.435 #48 NEW cov: 11820 ft: 15479 corp: 38/558b lim: 35 exec/s: 48 rss: 69Mb L: 20/33 MS: 1 InsertByte- 00:07:40.435 [2024-12-13 07:02:58.674273] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.694 [2024-12-13 07:02:58.674601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.694 [2024-12-13 07:02:58.674629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.694 [2024-12-13 07:02:58.674685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:01e700ff cdw11:12009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.694 [2024-12-13 07:02:58.674699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.694 #49 NEW cov: 11820 ft: 15491 corp: 39/576b lim: 35 exec/s: 49 rss: 69Mb L: 18/33 MS: 1 ChangeBinInt- 00:07:40.694 [2024-12-13 07:02:58.714538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:019a000a cdw11:e700a221 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.694 [2024-12-13 07:02:58.714564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.694 #50 NEW cov: 11820 ft: 15536 corp: 40/584b lim: 35 exec/s: 50 rss: 69Mb L: 8/33 MS: 1 ShuffleBytes- 00:07:40.694 [2024-12-13 07:02:58.754513] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.694 [2024-12-13 07:02:58.754742] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.694 [2024-12-13 07:02:58.754957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff010000 cdw11:a200e79a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.694 [2024-12-13 07:02:58.754984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.694 [2024-12-13 07:02:58.755043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7a0000c5 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.694 [2024-12-13 07:02:58.755062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.694 [2024-12-13 07:02:58.755118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00ff0000 cdw11:9a0001e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.694 [2024-12-13 07:02:58.755134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.694 [2024-12-13 07:02:58.794650] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.694 [2024-12-13 07:02:58.794885] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.694 [2024-12-13 07:02:58.795220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 
cdw10:ff010000 cdw11:a200e79a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.694 [2024-12-13 07:02:58.795246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.694 [2024-12-13 07:02:58.795304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7a0000c5 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.694 [2024-12-13 07:02:58.795318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.694 [2024-12-13 07:02:58.795373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00dc0000 cdw11:dc00dcdc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.795388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.695 [2024-12-13 07:02:58.795443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:01e700ff cdw11:63009aa2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.795457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.695 #52 NEW cov: 11820 ft: 15559 corp: 41/615b lim: 35 exec/s: 52 rss: 69Mb L: 31/33 MS: 2 PersAutoDict-InsertRepeatedBytes- DE: "\377\001\347\232\242c\305z"- 00:07:40.695 [2024-12-13 07:02:58.834772] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.695 [2024-12-13 07:02:58.834995] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.695 [2024-12-13 07:02:58.835322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:f7000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.835350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.695 [2024-12-13 07:02:58.835407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:64640064 cdw11:00006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.835421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.695 [2024-12-13 07:02:58.835475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:640000f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.835490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.695 [2024-12-13 07:02:58.835546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:3e640064 cdw11:64006464 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.835559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.695 #53 NEW cov: 11820 ft: 15569 corp: 42/645b lim: 35 exec/s: 53 rss: 69Mb L: 30/33 MS: 1 InsertByte- 00:07:40.695 [2024-12-13 07:02:58.874859] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.695 [2024-12-13 
07:02:58.875178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:9a0001e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.875209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.695 [2024-12-13 07:02:58.875266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0a00009f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.875280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.695 [2024-12-13 07:02:58.914982] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.695 [2024-12-13 07:02:58.915321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:9a0001e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.915348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.695 [2024-12-13 07:02:58.915404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0a00009f cdw11:f2000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.695 [2024-12-13 07:02:58.915418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.954 #55 NEW cov: 11820 ft: 15576 corp: 43/662b lim: 35 exec/s: 27 rss: 69Mb L: 17/33 MS: 2 CrossOver-InsertByte- 00:07:40.954 #55 DONE cov: 11820 ft: 15576 corp: 43/662b lim: 35 exec/s: 27 rss: 69Mb 00:07:40.954 ###### Recommended dictionary. ###### 00:07:40.954 "\000\000\000\000\000\000\000\000" # Uses: 1 00:07:40.954 "\377\001\347\232\242c\305z" # Uses: 5 00:07:40.954 ###### End of recommended dictionary. 
###### 00:07:40.954 Done 55 runs in 2 second(s) 00:07:40.954 07:02:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:40.954 07:02:59 -- ../common.sh@72 -- # (( i++ )) 00:07:40.954 07:02:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.954 07:02:59 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:40.954 07:02:59 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:40.954 07:02:59 -- nvmf/run.sh@24 -- # local timen=1 00:07:40.954 07:02:59 -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.954 07:02:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:40.954 07:02:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:40.954 07:02:59 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:40.954 07:02:59 -- nvmf/run.sh@29 -- # port=4403 00:07:40.954 07:02:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:40.954 07:02:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:40.954 07:02:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.954 07:02:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:40.954 [2024-12-13 07:02:59.089428] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:40.954 [2024-12-13 07:02:59.089510] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid490509 ] 00:07:40.954 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.214 [2024-12-13 07:02:59.271948] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.214 [2024-12-13 07:02:59.291761] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:41.214 [2024-12-13 07:02:59.291898] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.214 [2024-12-13 07:02:59.343308] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:41.214 [2024-12-13 07:02:59.359635] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:41.214 INFO: Running with entropic power schedule (0xFF, 100). 00:07:41.214 INFO: Seed: 3746730911 00:07:41.214 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:41.214 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:41.214 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:41.214 INFO: A corpus is not provided, starting from an empty corpus 00:07:41.214 #2 INITED exec/s: 0 rss: 59Mb 00:07:41.214 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:41.214 This may also happen if the target rejected all inputs we tried so far 00:07:41.473 NEW_FUNC[1/659]: 0x456418 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:41.473 NEW_FUNC[2/659]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:41.473 #4 NEW cov: 11480 ft: 11481 corp: 2/6b lim: 20 exec/s: 0 rss: 67Mb L: 5/5 MS: 2 CrossOver-CMP- DE: "\001\000\000\005"- 00:07:41.731 #5 NEW cov: 11593 ft: 12337 corp: 3/11b lim: 20 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeByte- 00:07:41.731 #6 NEW cov: 11599 ft: 12640 corp: 4/17b lim: 20 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 InsertByte- 00:07:41.731 #10 NEW cov: 11701 ft: 13195 corp: 5/32b lim: 20 exec/s: 0 rss: 67Mb L: 15/15 MS: 4 InsertByte-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:07:41.731 #16 NEW cov: 11701 ft: 13285 corp: 6/37b lim: 20 exec/s: 0 rss: 67Mb L: 5/15 MS: 1 ShuffleBytes- 00:07:41.990 #17 NEW cov: 11702 ft: 13588 corp: 7/45b lim: 20 exec/s: 0 rss: 67Mb L: 8/15 MS: 1 CopyPart- 00:07:41.990 #18 NEW cov: 11702 ft: 13629 corp: 8/53b lim: 20 exec/s: 0 rss: 67Mb L: 8/15 MS: 1 ChangeByte- 00:07:41.990 #19 NEW cov: 11702 ft: 13673 corp: 9/58b lim: 20 exec/s: 0 rss: 67Mb L: 5/15 MS: 1 ChangeBinInt- 00:07:41.990 #20 NEW cov: 11702 ft: 13705 corp: 10/63b lim: 20 exec/s: 0 rss: 67Mb L: 5/15 MS: 1 ChangeByte- 00:07:41.990 #21 NEW cov: 11702 ft: 13719 corp: 11/71b lim: 20 exec/s: 0 rss: 67Mb L: 8/15 MS: 1 CopyPart- 00:07:42.249 #22 NEW cov: 11702 ft: 13733 corp: 12/79b lim: 20 exec/s: 0 rss: 67Mb L: 8/15 MS: 1 ChangeByte- 00:07:42.249 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:42.249 #23 NEW cov: 11725 ft: 13778 corp: 13/87b lim: 20 exec/s: 0 rss: 68Mb L: 8/15 MS: 1 ShuffleBytes- 00:07:42.249 #24 NEW cov: 11725 ft: 13812 corp: 14/95b lim: 20 exec/s: 0 rss: 68Mb L: 8/15 MS: 1 ChangeByte- 00:07:42.249 #25 NEW cov: 11725 ft: 13829 corp: 15/103b lim: 20 exec/s: 25 rss: 68Mb L: 8/15 MS: 1 ChangeBit- 00:07:42.249 #26 NEW cov: 11725 ft: 13841 corp: 16/109b lim: 20 exec/s: 26 rss: 68Mb L: 6/15 MS: 1 InsertByte- 00:07:42.508 #27 NEW cov: 11725 ft: 13855 corp: 17/121b lim: 20 exec/s: 27 rss: 68Mb L: 12/15 MS: 1 PersAutoDict- DE: "\001\000\000\005"- 00:07:42.508 #28 NEW cov: 11725 ft: 13867 corp: 18/126b lim: 20 exec/s: 28 rss: 68Mb L: 5/15 MS: 1 ChangeBit- 00:07:42.508 #29 NEW cov: 11725 ft: 13890 corp: 19/138b lim: 20 exec/s: 29 rss: 68Mb L: 12/15 MS: 1 ChangeBinInt- 00:07:42.508 [2024-12-13 07:03:00.654108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.508 [2024-12-13 07:03:00.654160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.508 NEW_FUNC[1/20]: 0x1137598 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:42.508 NEW_FUNC[2/20]: 0x1138118 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:42.508 #30 NEW cov: 12067 ft: 14471 corp: 20/154b lim: 20 exec/s: 30 rss: 68Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:07:42.508 #31 NEW cov: 12067 ft: 14496 corp: 21/167b lim: 20 exec/s: 31 rss: 68Mb L: 13/16 MS: 1 InsertByte- 00:07:42.766 #32 NEW cov: 12067 ft: 14509 corp: 22/176b lim: 20 exec/s: 32 rss: 68Mb L: 9/16 MS: 1 
InsertByte- 00:07:42.766 #33 NEW cov: 12067 ft: 14592 corp: 23/188b lim: 20 exec/s: 33 rss: 68Mb L: 12/16 MS: 1 ShuffleBytes- 00:07:42.766 #34 NEW cov: 12067 ft: 14622 corp: 24/203b lim: 20 exec/s: 34 rss: 68Mb L: 15/16 MS: 1 ChangeASCIIInt- 00:07:42.766 #35 NEW cov: 12067 ft: 14624 corp: 25/213b lim: 20 exec/s: 35 rss: 68Mb L: 10/16 MS: 1 CopyPart- 00:07:43.024 #40 NEW cov: 12067 ft: 14634 corp: 26/217b lim: 20 exec/s: 40 rss: 68Mb L: 4/16 MS: 5 CrossOver-ShuffleBytes-ChangeBit-EraseBytes-CrossOver- 00:07:43.024 #41 NEW cov: 12067 ft: 14656 corp: 27/224b lim: 20 exec/s: 41 rss: 68Mb L: 7/16 MS: 1 InsertByte- 00:07:43.024 #42 NEW cov: 12067 ft: 14675 corp: 28/229b lim: 20 exec/s: 42 rss: 69Mb L: 5/16 MS: 1 InsertByte- 00:07:43.024 #43 NEW cov: 12067 ft: 14695 corp: 29/239b lim: 20 exec/s: 43 rss: 69Mb L: 10/16 MS: 1 ChangeByte- 00:07:43.024 #44 NEW cov: 12067 ft: 14697 corp: 30/247b lim: 20 exec/s: 44 rss: 69Mb L: 8/16 MS: 1 EraseBytes- 00:07:43.283 #45 NEW cov: 12067 ft: 14723 corp: 31/259b lim: 20 exec/s: 45 rss: 69Mb L: 12/16 MS: 1 ChangeBinInt- 00:07:43.283 #46 NEW cov: 12067 ft: 14789 corp: 32/279b lim: 20 exec/s: 46 rss: 69Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:43.283 #47 NEW cov: 12067 ft: 14805 corp: 33/291b lim: 20 exec/s: 23 rss: 69Mb L: 12/20 MS: 1 ChangeBit- 00:07:43.283 #47 DONE cov: 12067 ft: 14805 corp: 33/291b lim: 20 exec/s: 23 rss: 69Mb 00:07:43.283 ###### Recommended dictionary. ###### 00:07:43.283 "\001\000\000\005" # Uses: 1 00:07:43.283 ###### End of recommended dictionary. ###### 00:07:43.283 Done 47 runs in 2 second(s) 00:07:43.283 07:03:01 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:43.283 07:03:01 -- ../common.sh@72 -- # (( i++ )) 00:07:43.283 07:03:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:43.283 07:03:01 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:43.283 07:03:01 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:43.283 07:03:01 -- nvmf/run.sh@24 -- # local timen=1 00:07:43.283 07:03:01 -- nvmf/run.sh@25 -- # local core=0x1 00:07:43.283 07:03:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:43.283 07:03:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:43.283 07:03:01 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:43.283 07:03:01 -- nvmf/run.sh@29 -- # port=4404 00:07:43.283 07:03:01 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:43.542 07:03:01 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:43.542 07:03:01 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:43.542 07:03:01 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:43.542 [2024-12-13 07:03:01.558156] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:43.542 [2024-12-13 07:03:01.558227] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid491151 ] 00:07:43.542 EAL: No free 2048 kB hugepages reported on node 1 00:07:43.542 [2024-12-13 07:03:01.741287] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.542 [2024-12-13 07:03:01.760763] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:43.542 [2024-12-13 07:03:01.760898] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.801 [2024-12-13 07:03:01.812598] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.801 [2024-12-13 07:03:01.828905] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:43.801 INFO: Running with entropic power schedule (0xFF, 100). 00:07:43.801 INFO: Seed: 1921744367 00:07:43.801 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:43.801 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:43.801 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:43.801 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.801 #2 INITED exec/s: 0 rss: 59Mb 00:07:43.801 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:43.801 This may also happen if the target rejected all inputs we tried so far 00:07:43.801 [2024-12-13 07:03:01.895626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.801 [2024-12-13 07:03:01.895665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.801 [2024-12-13 07:03:01.895760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.801 [2024-12-13 07:03:01.895779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.801 [2024-12-13 07:03:01.895902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.801 [2024-12-13 07:03:01.895918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.060 NEW_FUNC[1/671]: 0x457518 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:44.060 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:44.060 #11 NEW cov: 11603 ft: 11604 corp: 2/22b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 4 CopyPart-CopyPart-CrossOver-InsertRepeatedBytes- 00:07:44.060 [2024-12-13 07:03:02.216817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.060 [2024-12-13 07:03:02.216880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.060 [2024-12-13 07:03:02.217055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.060 [2024-12-13 07:03:02.217087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.060 [2024-12-13 07:03:02.217253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.060 [2024-12-13 07:03:02.217287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.060 #12 NEW cov: 11718 ft: 12293 corp: 3/43b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 ChangeByte- 00:07:44.060 [2024-12-13 07:03:02.276655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.060 [2024-12-13 07:03:02.276688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.060 [2024-12-13 07:03:02.276823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.060 [2024-12-13 07:03:02.276840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.060 [2024-12-13 07:03:02.276984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.060 [2024-12-13 07:03:02.277001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.060 #13 NEW cov: 11724 ft: 12460 corp: 4/64b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 CopyPart- 00:07:44.320 [2024-12-13 07:03:02.326760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.326790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.320 [2024-12-13 07:03:02.326914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.326933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.320 [2024-12-13 07:03:02.327065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.327084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.320 #14 NEW cov: 11809 ft: 12753 corp: 5/85b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 CopyPart- 00:07:44.320 [2024-12-13 07:03:02.376933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:44.320 [2024-12-13 07:03:02.376962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.320 [2024-12-13 07:03:02.377116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.377135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.320 [2024-12-13 07:03:02.377264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.377281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.320 #15 NEW cov: 11809 ft: 12928 corp: 6/106b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 ChangeBit- 00:07:44.320 [2024-12-13 07:03:02.426441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.426470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.320 #16 NEW cov: 11809 ft: 13762 corp: 7/118b lim: 35 exec/s: 0 rss: 67Mb L: 12/21 MS: 1 EraseBytes- 00:07:44.320 [2024-12-13 07:03:02.487255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.487303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.320 [2024-12-13 07:03:02.487444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.487462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.320 [2024-12-13 07:03:02.487592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.487612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.320 #17 NEW cov: 11809 ft: 13799 corp: 8/139b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 ChangeBinInt- 00:07:44.320 [2024-12-13 07:03:02.546826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.320 [2024-12-13 07:03:02.546859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.579 #18 NEW cov: 11809 ft: 13825 corp: 9/150b lim: 35 exec/s: 0 rss: 67Mb L: 11/21 MS: 1 EraseBytes- 00:07:44.579 [2024-12-13 07:03:02.607676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.579 [2024-12-13 07:03:02.607704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:44.579 [2024-12-13 07:03:02.607836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.579 [2024-12-13 07:03:02.607855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.579 [2024-12-13 07:03:02.607991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:07000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.579 [2024-12-13 07:03:02.608009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.579 #24 NEW cov: 11809 ft: 13841 corp: 10/171b lim: 35 exec/s: 0 rss: 67Mb L: 21/21 MS: 1 ChangeBinInt- 00:07:44.579 [2024-12-13 07:03:02.657111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02e70a01 cdw11:a2cf0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.579 [2024-12-13 07:03:02.657137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.579 #25 NEW cov: 11809 ft: 13863 corp: 11/180b lim: 35 exec/s: 0 rss: 67Mb L: 9/21 MS: 1 CMP- DE: "\001\002\347\242\317-\037\332"- 00:07:44.579 [2024-12-13 07:03:02.707917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.579 [2024-12-13 07:03:02.707945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.579 [2024-12-13 07:03:02.708073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.579 [2024-12-13 07:03:02.708092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.579 [2024-12-13 07:03:02.708232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.579 [2024-12-13 07:03:02.708250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.579 #26 NEW cov: 11809 ft: 13911 corp: 12/204b lim: 35 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 CopyPart- 00:07:44.579 [2024-12-13 07:03:02.767629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:02e70a01 cdw11:a2cf0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.579 [2024-12-13 07:03:02.767657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.579 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:44.579 #27 NEW cov: 11832 ft: 13990 corp: 13/213b lim: 35 exec/s: 0 rss: 67Mb L: 9/24 MS: 1 ChangeBit- 00:07:44.838 [2024-12-13 07:03:02.828888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.838 [2024-12-13 07:03:02.828915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:44.838 [2024-12-13 07:03:02.829036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c0c00000 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.838 [2024-12-13 07:03:02.829059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.838 [2024-12-13 07:03:02.829184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c0c0c0c0 cdw11:c0c00003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.838 [2024-12-13 07:03:02.829205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.838 [2024-12-13 07:03:02.829344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000c0c0 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.838 [2024-12-13 07:03:02.829362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.838 [2024-12-13 07:03:02.829503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:07000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.838 [2024-12-13 07:03:02.829519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.838 #33 NEW cov: 11832 ft: 14376 corp: 14/248b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:44.838 [2024-12-13 07:03:02.888674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.838 [2024-12-13 07:03:02.888702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.838 [2024-12-13 07:03:02.888855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.838 [2024-12-13 07:03:02.888873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.838 [2024-12-13 07:03:02.888974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:06000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.838 [2024-12-13 07:03:02.888993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.839 [2024-12-13 07:03:02.889118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.839 [2024-12-13 07:03:02.889135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.839 #34 NEW cov: 11832 ft: 14406 corp: 15/277b lim: 35 exec/s: 34 rss: 68Mb L: 29/35 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:44.839 [2024-12-13 07:03:02.948404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.839 [2024-12-13 07:03:02.948432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.839 [2024-12-13 07:03:02.948571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.839 [2024-12-13 07:03:02.948589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.839 #35 NEW cov: 11832 ft: 14619 corp: 16/295b lim: 35 exec/s: 35 rss: 68Mb L: 18/35 MS: 1 EraseBytes- 00:07:44.839 [2024-12-13 07:03:02.998284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:002f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.839 [2024-12-13 07:03:02.998312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.839 #36 NEW cov: 11832 ft: 14633 corp: 17/306b lim: 35 exec/s: 36 rss: 68Mb L: 11/35 MS: 1 ChangeByte- 00:07:44.839 [2024-12-13 07:03:03.058793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.839 [2024-12-13 07:03:03.058825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.839 [2024-12-13 07:03:03.058963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.839 [2024-12-13 07:03:03.058981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.098 #37 NEW cov: 11832 ft: 14680 corp: 18/320b lim: 35 exec/s: 37 rss: 68Mb L: 14/35 MS: 1 EraseBytes- 00:07:45.098 [2024-12-13 07:03:03.109581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.109608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.098 [2024-12-13 07:03:03.109744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff00ffff cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.109764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.098 [2024-12-13 07:03:03.109884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.109902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.098 [2024-12-13 07:03:03.110020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.110038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.098 #38 NEW cov: 11832 ft: 14695 corp: 19/352b lim: 35 exec/s: 38 rss: 68Mb L: 32/35 MS: 1 InsertRepeatedBytes- 00:07:45.098 [2024-12-13 07:03:03.169842] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.169872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.098 [2024-12-13 07:03:03.170002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff00ffff cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.170020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.098 [2024-12-13 07:03:03.170143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.170160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.098 [2024-12-13 07:03:03.170287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:000a0000 cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.170304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.098 #39 NEW cov: 11832 ft: 14719 corp: 20/383b lim: 35 exec/s: 39 rss: 68Mb L: 31/35 MS: 1 CrossOver- 00:07:45.098 [2024-12-13 07:03:03.228986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00020003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.229014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.098 #40 NEW cov: 11832 ft: 14725 corp: 21/395b lim: 35 exec/s: 40 rss: 68Mb L: 12/35 MS: 1 InsertRepeatedBytes- 00:07:45.098 [2024-12-13 07:03:03.280011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.280040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.098 [2024-12-13 07:03:03.280174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff00ffff cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.280197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.098 [2024-12-13 07:03:03.280319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.280336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.098 [2024-12-13 07:03:03.280453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:000a0000 cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.098 [2024-12-13 07:03:03.280472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.098 #41 NEW cov: 11832 ft: 14739 corp: 
22/426b lim: 35 exec/s: 41 rss: 68Mb L: 31/35 MS: 1 ChangeBit- 00:07:45.357 [2024-12-13 07:03:03.340223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.340251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.340395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff00ffff cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.340415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.340550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.340567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.340700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:000a0000 cdw11:0aa00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.340719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.358 #42 NEW cov: 11832 ft: 14745 corp: 23/458b lim: 35 exec/s: 42 rss: 68Mb L: 32/35 MS: 1 InsertByte- 00:07:45.358 [2024-12-13 07:03:03.389891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:00070000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.389919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.390043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.390059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.358 #43 NEW cov: 11832 ft: 14752 corp: 24/472b lim: 35 exec/s: 43 rss: 68Mb L: 14/35 MS: 1 CMP- DE: "\007\000\000\000"- 00:07:45.358 [2024-12-13 07:03:03.450094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.450120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.450272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.450291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.450410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:01020003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.450429] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.358 #44 NEW cov: 11832 ft: 14780 corp: 25/498b lim: 35 exec/s: 44 rss: 69Mb L: 26/35 MS: 1 InsertRepeatedBytes- 00:07:45.358 [2024-12-13 07:03:03.500374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:01020003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.500404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.500529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2d1fa2cf cdw11:da000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.500546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.500676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.500691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.358 #45 NEW cov: 11832 ft: 14845 corp: 26/519b lim: 35 exec/s: 45 rss: 69Mb L: 21/35 MS: 1 PersAutoDict- DE: "\001\002\347\242\317-\037\332"- 00:07:45.358 [2024-12-13 07:03:03.550630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.550659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.550785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.550802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.358 [2024-12-13 07:03:03.550940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.358 [2024-12-13 07:03:03.550960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.358 #46 NEW cov: 11832 ft: 14860 corp: 27/543b lim: 35 exec/s: 46 rss: 69Mb L: 24/35 MS: 1 ChangeBit- 00:07:45.617 [2024-12-13 07:03:03.610541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:00070000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.610569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.617 [2024-12-13 07:03:03.610696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:27000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.610713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.617 #47 NEW cov: 11832 ft: 14877 corp: 28/557b lim: 35 exec/s: 47 rss: 69Mb L: 14/35 MS: 1 
ChangeByte- 00:07:45.617 [2024-12-13 07:03:03.671077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.671111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.617 [2024-12-13 07:03:03.671258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.671277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.617 [2024-12-13 07:03:03.671402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.671419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.617 #48 NEW cov: 11832 ft: 14896 corp: 29/584b lim: 35 exec/s: 48 rss: 69Mb L: 27/35 MS: 1 InsertRepeatedBytes- 00:07:45.617 [2024-12-13 07:03:03.721507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.721536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.617 [2024-12-13 07:03:03.721667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ff00ffff cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.721685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.617 [2024-12-13 07:03:03.721815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0010ffff cdw11:00060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.721833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.617 [2024-12-13 07:03:03.721965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:000a0000 cdw11:0a0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.721983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.617 #49 NEW cov: 11832 ft: 14909 corp: 30/615b lim: 35 exec/s: 49 rss: 69Mb L: 31/35 MS: 1 ChangeBit- 00:07:45.617 [2024-12-13 07:03:03.771378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.771407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.617 [2024-12-13 07:03:03.771537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:cececece cdw11:ce000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.771555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:45.617 [2024-12-13 07:03:03.771679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.771699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.617 #50 NEW cov: 11832 ft: 14935 corp: 31/642b lim: 35 exec/s: 50 rss: 69Mb L: 27/35 MS: 1 InsertRepeatedBytes- 00:07:45.617 [2024-12-13 07:03:03.821307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.821338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.617 [2024-12-13 07:03:03.821466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.617 [2024-12-13 07:03:03.821486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.618 #51 NEW cov: 11832 ft: 14956 corp: 32/656b lim: 35 exec/s: 51 rss: 69Mb L: 14/35 MS: 1 ShuffleBytes- 00:07:45.876 [2024-12-13 07:03:03.871787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0a0a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.876 [2024-12-13 07:03:03.871818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.876 [2024-12-13 07:03:03.871916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:01020003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.876 [2024-12-13 07:03:03.871935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.876 [2024-12-13 07:03:03.872060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2d1fa2cf cdw11:da000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.876 [2024-12-13 07:03:03.872077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.877 #52 NEW cov: 11832 ft: 14970 corp: 33/677b lim: 35 exec/s: 26 rss: 69Mb L: 21/35 MS: 1 PersAutoDict- DE: "\001\002\347\242\317-\037\332"- 00:07:45.877 #52 DONE cov: 11832 ft: 14970 corp: 33/677b lim: 35 exec/s: 26 rss: 69Mb 00:07:45.877 ###### Recommended dictionary. ###### 00:07:45.877 "\001\002\347\242\317-\037\332" # Uses: 2 00:07:45.877 "\377\377\377\377\377\377\377\377" # Uses: 0 00:07:45.877 "\007\000\000\000" # Uses: 0 00:07:45.877 ###### End of recommended dictionary. 
###### 00:07:45.877 Done 52 runs in 2 second(s) 00:07:45.877 07:03:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:45.877 07:03:04 -- ../common.sh@72 -- # (( i++ )) 00:07:45.877 07:03:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.877 07:03:04 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:45.877 07:03:04 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:45.877 07:03:04 -- nvmf/run.sh@24 -- # local timen=1 00:07:45.877 07:03:04 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.877 07:03:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.877 07:03:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:45.877 07:03:04 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:45.877 07:03:04 -- nvmf/run.sh@29 -- # port=4405 00:07:45.877 07:03:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.877 07:03:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:45.877 07:03:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.877 07:03:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:45.877 [2024-12-13 07:03:04.053286] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:45.877 [2024-12-13 07:03:04.053361] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid491777 ] 00:07:45.877 EAL: No free 2048 kB hugepages reported on node 1 00:07:46.136 [2024-12-13 07:03:04.235183] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.136 [2024-12-13 07:03:04.254843] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:46.136 [2024-12-13 07:03:04.254978] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.136 [2024-12-13 07:03:04.306336] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:46.136 [2024-12-13 07:03:04.322662] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:46.136 INFO: Running with entropic power schedule (0xFF, 100). 00:07:46.136 INFO: Seed: 121777623 00:07:46.136 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:46.136 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:46.136 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:46.136 INFO: A corpus is not provided, starting from an empty corpus 00:07:46.136 #2 INITED exec/s: 0 rss: 59Mb 00:07:46.136 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
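The banner above is libFuzzer starting run 5 of the harness: run.sh picks fuzzer_type=5, derives TCP port 4405 for that index, rewrites trsvcid in the JSON config with sed, and points llvm_nvme_fuzz at a local NVMe/TCP listener. The "#N NEW cov: ..." lines that follow are standard libFuzzer progress records: cov counts covered code blocks/edges, ft coverage features, "corp: a/Nb" the corpus entry count and total size, "lim" the current input-length limit, exec/s the execution rate, rss resident memory, "L: a/b" this input's length versus the largest corpus element, and "MS: k Mut1-Mut2-" the mutation sequence that produced it (with "DE:" naming the dictionary entry when CMP or PersAutoDict fired). NEW_FUNC lines list functions reached for the first time. The paired *NOTICE* records from nvme_qpair.c show each fuzzed admin command as submitted and its completion; the INVALID OPCODE (00/01) status on every completion is expected here, since queue-management opcodes such as CREATE IO SQ are not valid admin commands on a fabrics (NVMe/TCP) controller. As a rough illustration of the harness shape only, a minimal libFuzzer entry point in this style is sketched below; the struct and fuzz_one_cmd() helper are simplified stand-ins, not SPDK's actual llvm_nvme_fuzz.c (which hands the bytes to a live controller via TestOneInput).

/* Minimal sketch of a libFuzzer target in the style of llvm_nvme_fuzz.
 * Assumptions: nvme_cmd_sketch is a simplified stand-in for a real NVMe
 * submission-queue entry, and fuzz_one_cmd() is a hypothetical helper;
 * the real harness submits to a live NVMe/TCP controller instead. */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct nvme_cmd_sketch {
    uint8_t  opc;    /* opcode: 0x01 CREATE IO SQ, 0x04 DELETE IO CQ, ... */
    uint16_t cid;    /* command identifier (the cid:N in the log) */
    uint32_t nsid;   /* namespace id */
    uint32_t cdw10;  /* command dwords 10/11, printed above per command */
    uint32_t cdw11;
};

static void fuzz_one_cmd(const struct nvme_cmd_sketch *cmd)
{
    /* Hypothetical: submit cmd to the target and poll for the
     * completion that spdk_nvme_print_completion would print. */
    (void)cmd;
}

/* libFuzzer calls this once per generated input. */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    struct nvme_cmd_sketch cmd;

    if (size < sizeof(cmd)) {
        return 0;                     /* too short to form a command */
    }
    memcpy(&cmd, data, sizeof(cmd));  /* raw fuzz bytes become the command */
    fuzz_one_cmd(&cmd);
    return 0;  /* always 0: crashes and sanitizer reports are the signal */
}

Built with clang -fsanitize=fuzzer (plus a sanitizer), libFuzzer drives such an entry point exactly as logged here: mutate, execute, and keep any input that lights up new coverage counters.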
00:07:46.136 This may also happen if the target rejected all inputs we tried so far 00:07:46.136 [2024-12-13 07:03:04.368115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.136 [2024-12-13 07:03:04.368144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.136 [2024-12-13 07:03:04.368201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.136 [2024-12-13 07:03:04.368216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.136 [2024-12-13 07:03:04.368268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.136 [2024-12-13 07:03:04.368281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.654 NEW_FUNC[1/670]: 0x4596b8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:46.654 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:46.654 #13 NEW cov: 11607 ft: 11617 corp: 2/35b lim: 45 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:46.654 [2024-12-13 07:03:04.678563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.654 [2024-12-13 07:03:04.678599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.654 NEW_FUNC[1/1]: 0x1246768 in nvmf_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/nvmf.c:153 00:07:46.654 #14 NEW cov: 11729 ft: 12822 corp: 3/46b lim: 45 exec/s: 0 rss: 67Mb L: 11/34 MS: 1 CrossOver- 00:07:46.654 [2024-12-13 07:03:04.718583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:985e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.654 [2024-12-13 07:03:04.718608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.654 #15 NEW cov: 11735 ft: 13010 corp: 4/57b lim: 45 exec/s: 0 rss: 67Mb L: 11/34 MS: 1 ChangeBinInt- 00:07:46.654 [2024-12-13 07:03:04.758688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:a7f40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.654 [2024-12-13 07:03:04.758713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.654 #16 NEW cov: 11820 ft: 13225 corp: 5/68b lim: 45 exec/s: 0 rss: 67Mb L: 11/34 MS: 1 CMP- DE: "\001\002\347\236\247\3644\240"- 00:07:46.654 [2024-12-13 07:03:04.798855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:27f40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.654 [2024-12-13 07:03:04.798879] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.654 #17 NEW cov: 11820 ft: 13300 corp: 6/79b lim: 45 exec/s: 0 rss: 67Mb L: 11/34 MS: 1 ChangeBit- 00:07:46.654 [2024-12-13 07:03:04.838969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.654 [2024-12-13 07:03:04.838993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.654 #18 NEW cov: 11820 ft: 13478 corp: 7/93b lim: 45 exec/s: 0 rss: 67Mb L: 14/34 MS: 1 CopyPart- 00:07:46.654 [2024-12-13 07:03:04.879231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad0a5e cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.654 [2024-12-13 07:03:04.879256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.654 [2024-12-13 07:03:04.879322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.654 [2024-12-13 07:03:04.879336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.913 #19 NEW cov: 11820 ft: 13832 corp: 8/113b lim: 45 exec/s: 0 rss: 67Mb L: 20/34 MS: 1 InsertRepeatedBytes- 00:07:46.913 [2024-12-13 07:03:04.919221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:02e70201 cdw11:9ea70007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.913 [2024-12-13 07:03:04.919246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.913 #22 NEW cov: 11820 ft: 13909 corp: 9/122b lim: 45 exec/s: 0 rss: 67Mb L: 9/34 MS: 3 CopyPart-ChangeBit-PersAutoDict- DE: "\001\002\347\236\247\3644\240"- 00:07:46.913 [2024-12-13 07:03:04.959336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:26f40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.913 [2024-12-13 07:03:04.959362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.913 #23 NEW cov: 11820 ft: 13935 corp: 10/133b lim: 45 exec/s: 0 rss: 67Mb L: 11/34 MS: 1 ChangeByte- 00:07:46.913 [2024-12-13 07:03:04.999432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.913 [2024-12-13 07:03:04.999457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.913 #24 NEW cov: 11820 ft: 13988 corp: 11/150b lim: 45 exec/s: 0 rss: 67Mb L: 17/34 MS: 1 EraseBytes- 00:07:46.913 [2024-12-13 07:03:05.039737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:a7f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.913 [2024-12-13 07:03:05.039761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.913 [2024-12-13 07:03:05.039810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:f4349ea7 
cdw11:a0340005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.913 [2024-12-13 07:03:05.039824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.913 #25 NEW cov: 11820 ft: 14070 corp: 12/169b lim: 45 exec/s: 0 rss: 67Mb L: 19/34 MS: 1 PersAutoDict- DE: "\001\002\347\236\247\3644\240"- 00:07:46.913 [2024-12-13 07:03:05.079672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:985e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.913 [2024-12-13 07:03:05.079696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.913 #26 NEW cov: 11820 ft: 14136 corp: 13/180b lim: 45 exec/s: 0 rss: 67Mb L: 11/34 MS: 1 CrossOver- 00:07:46.913 [2024-12-13 07:03:05.119958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:985e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.913 [2024-12-13 07:03:05.119985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.913 [2024-12-13 07:03:05.120050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:ad5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.913 [2024-12-13 07:03:05.120064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.913 #27 NEW cov: 11820 ft: 14146 corp: 14/202b lim: 45 exec/s: 0 rss: 67Mb L: 22/34 MS: 1 CrossOver- 00:07:47.172 [2024-12-13 07:03:05.159916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:27f40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.159940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.172 #28 NEW cov: 11820 ft: 14179 corp: 15/213b lim: 45 exec/s: 0 rss: 68Mb L: 11/34 MS: 1 ChangeBit- 00:07:47.172 [2024-12-13 07:03:05.200171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad0a5e cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.200202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.172 [2024-12-13 07:03:05.200257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5e5e5ede cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.200271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.172 #29 NEW cov: 11820 ft: 14195 corp: 16/233b lim: 45 exec/s: 0 rss: 68Mb L: 20/34 MS: 1 ChangeBit- 00:07:47.172 [2024-12-13 07:03:05.240315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:27f40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.240340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.172 [2024-12-13 07:03:05.240394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00003e00 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.240408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.172 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:47.172 #30 NEW cov: 11843 ft: 14231 corp: 17/252b lim: 45 exec/s: 0 rss: 68Mb L: 19/34 MS: 1 CMP- DE: ">\000\000\000\000\000\000\000"- 00:07:47.172 [2024-12-13 07:03:05.280422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:985e0a5e cdw11:5e5e0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.280447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.172 [2024-12-13 07:03:05.280500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ad5eadad cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.280514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.172 #31 NEW cov: 11843 ft: 14248 corp: 18/272b lim: 45 exec/s: 0 rss: 68Mb L: 20/34 MS: 1 EraseBytes- 00:07:47.172 [2024-12-13 07:03:05.320534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:a7f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.320558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.172 [2024-12-13 07:03:05.320626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:f402a0a7 cdw11:9e340005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.320643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.172 #32 NEW cov: 11843 ft: 14252 corp: 19/291b lim: 45 exec/s: 0 rss: 68Mb L: 19/34 MS: 1 ShuffleBytes- 00:07:47.172 [2024-12-13 07:03:05.360481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:985e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.360505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.172 #33 NEW cov: 11843 ft: 14270 corp: 20/305b lim: 45 exec/s: 33 rss: 68Mb L: 14/34 MS: 1 CopyPart- 00:07:47.172 [2024-12-13 07:03:05.400589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:985e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.172 [2024-12-13 07:03:05.400614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.431 #39 NEW cov: 11843 ft: 14328 corp: 21/316b lim: 45 exec/s: 39 rss: 68Mb L: 11/34 MS: 1 CrossOver- 00:07:47.431 [2024-12-13 07:03:05.441181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad0aad cdw11:adad0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.441209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.431 [2024-12-13 
07:03:05.441261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.441274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.431 [2024-12-13 07:03:05.441327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:adad5ead cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.441340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.431 [2024-12-13 07:03:05.441392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.441405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.431 #40 NEW cov: 11843 ft: 14671 corp: 22/353b lim: 45 exec/s: 40 rss: 68Mb L: 37/37 MS: 1 CopyPart- 00:07:47.431 [2024-12-13 07:03:05.480812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.480836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.431 #41 NEW cov: 11843 ft: 14750 corp: 23/367b lim: 45 exec/s: 41 rss: 68Mb L: 14/37 MS: 1 ChangeBinInt- 00:07:47.431 [2024-12-13 07:03:05.521089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad0a5e cdw11:adaf0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.521113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.431 [2024-12-13 07:03:05.521164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5e5e5ede cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.521177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.431 #42 NEW cov: 11843 ft: 14791 corp: 24/387b lim: 45 exec/s: 42 rss: 68Mb L: 20/37 MS: 1 ChangeBit- 00:07:47.431 [2024-12-13 07:03:05.561078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:27f40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.561106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.431 #43 NEW cov: 11843 ft: 14800 corp: 25/398b lim: 45 exec/s: 43 rss: 68Mb L: 11/37 MS: 1 ChangeByte- 00:07:47.431 [2024-12-13 07:03:05.601140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e555e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.601165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.431 #44 NEW cov: 11843 ft: 14835 corp: 26/412b lim: 45 exec/s: 44 rss: 68Mb L: 14/37 MS: 1 CopyPart- 00:07:47.431 [2024-12-13 07:03:05.651323] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:985e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.431 [2024-12-13 07:03:05.651349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 #45 NEW cov: 11843 ft: 14895 corp: 27/426b lim: 45 exec/s: 45 rss: 68Mb L: 14/37 MS: 1 ShuffleBytes- 00:07:47.690 [2024-12-13 07:03:05.691454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:a7f40001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.690 [2024-12-13 07:03:05.691479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 #46 NEW cov: 11843 ft: 14906 corp: 28/437b lim: 45 exec/s: 46 rss: 68Mb L: 11/37 MS: 1 ChangeBinInt- 00:07:47.690 [2024-12-13 07:03:05.731519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:02e70201 cdw11:9ea70007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.690 [2024-12-13 07:03:05.731545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 #47 NEW cov: 11843 ft: 14921 corp: 29/454b lim: 45 exec/s: 47 rss: 69Mb L: 17/37 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"- 00:07:47.690 [2024-12-13 07:03:05.771951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.690 [2024-12-13 07:03:05.771976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 [2024-12-13 07:03:05.772030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.690 [2024-12-13 07:03:05.772044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.690 [2024-12-13 07:03:05.772095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:9ea702e7 cdw11:f43e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.690 [2024-12-13 07:03:05.772124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.690 #48 NEW cov: 11843 ft: 14928 corp: 30/485b lim: 45 exec/s: 48 rss: 69Mb L: 31/37 MS: 1 CrossOver- 00:07:47.690 [2024-12-13 07:03:05.811939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:985e0006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.690 [2024-12-13 07:03:05.811963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.690 [2024-12-13 07:03:05.812014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:ad5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.691 [2024-12-13 07:03:05.812027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.691 #49 NEW cov: 11843 ft: 14935 corp: 31/507b lim: 45 exec/s: 49 rss: 69Mb L: 22/37 MS: 1 ChangeBit- 00:07:47.691 [2024-12-13 07:03:05.852023] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00001300 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.691 [2024-12-13 07:03:05.852050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.691 [2024-12-13 07:03:05.852103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:f4349ea7 cdw11:a0340005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.691 [2024-12-13 07:03:05.852116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.691 #50 NEW cov: 11843 ft: 14943 corp: 32/526b lim: 45 exec/s: 50 rss: 69Mb L: 19/37 MS: 1 ChangeBinInt- 00:07:47.691 [2024-12-13 07:03:05.891968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:664f1792 cdw11:9fe70000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.691 [2024-12-13 07:03:05.891992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.691 #54 NEW cov: 11843 ft: 14969 corp: 33/535b lim: 45 exec/s: 54 rss: 69Mb L: 9/37 MS: 4 CopyPart-ChangeBit-ChangeBit-CMP- DE: "\027\222fO\237\347\002\000"- 00:07:47.950 [2024-12-13 07:03:05.932093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:01000102 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:05.932117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 #55 NEW cov: 11843 ft: 15028 corp: 34/546b lim: 45 exec/s: 55 rss: 69Mb L: 11/37 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:47.950 [2024-12-13 07:03:05.972242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a5e cdw11:2c980002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:05.972266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 #56 NEW cov: 11843 ft: 15099 corp: 35/558b lim: 45 exec/s: 56 rss: 69Mb L: 12/37 MS: 1 InsertByte- 00:07:47.950 [2024-12-13 07:03:06.012328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e0a54 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.012352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 #57 NEW cov: 11843 ft: 15107 corp: 36/569b lim: 45 exec/s: 57 rss: 69Mb L: 11/37 MS: 1 ChangeBinInt- 00:07:47.950 [2024-12-13 07:03:06.052568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:a7010007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.052592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 [2024-12-13 07:03:06.052644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:a702e734 cdw11:a0340005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.052657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:47.950 #58 NEW cov: 11843 ft: 15125 corp: 37/588b lim: 45 exec/s: 58 rss: 69Mb L: 19/37 MS: 1 ShuffleBytes- 00:07:47.950 [2024-12-13 07:03:06.092691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:adad0a5e cdw11:adad0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.092715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 [2024-12-13 07:03:06.092768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5e5ea25e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.092781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.950 #59 NEW cov: 11843 ft: 15140 corp: 38/608b lim: 45 exec/s: 59 rss: 69Mb L: 20/37 MS: 1 ChangeByte- 00:07:47.950 [2024-12-13 07:03:06.132804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:e79e0102 cdw11:a7010007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.132828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 [2024-12-13 07:03:06.132895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:9266e717 cdw11:4f9f0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.132909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.950 #60 NEW cov: 11843 ft: 15154 corp: 39/627b lim: 45 exec/s: 60 rss: 69Mb L: 19/37 MS: 1 PersAutoDict- DE: "\027\222fO\237\347\002\000"- 00:07:47.950 [2024-12-13 07:03:06.173283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.173308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.950 [2024-12-13 07:03:06.173358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.173371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.950 [2024-12-13 07:03:06.173419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.173433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.950 [2024-12-13 07:03:06.173480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:5e5e5e5e cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.950 [2024-12-13 07:03:06.173492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.209 #61 NEW cov: 11843 ft: 15172 corp: 40/667b lim: 45 exec/s: 61 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:48.209 [2024-12-13 07:03:06.213030] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ad5e0a5e cdw11:5e980002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.209 [2024-12-13 07:03:06.213054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.209 [2024-12-13 07:03:06.213102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:adadadad cdw11:ad5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.209 [2024-12-13 07:03:06.213115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.209 #62 NEW cov: 11843 ft: 15189 corp: 41/693b lim: 45 exec/s: 62 rss: 69Mb L: 26/40 MS: 1 CrossOver- 00:07:48.209 [2024-12-13 07:03:06.253055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:02e70201 cdw11:9ea70006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.209 [2024-12-13 07:03:06.253080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.209 #63 NEW cov: 11843 ft: 15213 corp: 42/702b lim: 45 exec/s: 63 rss: 69Mb L: 9/40 MS: 1 ChangeBit- 00:07:48.209 [2024-12-13 07:03:06.293097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:02e70201 cdw11:2d9e0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.209 [2024-12-13 07:03:06.293121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.209 #64 NEW cov: 11843 ft: 15227 corp: 43/712b lim: 45 exec/s: 64 rss: 69Mb L: 10/40 MS: 1 InsertByte- 00:07:48.209 [2024-12-13 07:03:06.333246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000013e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.209 [2024-12-13 07:03:06.333273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.209 #65 NEW cov: 11843 ft: 15234 corp: 44/723b lim: 45 exec/s: 32 rss: 69Mb L: 11/40 MS: 1 PersAutoDict- DE: ">\000\000\000\000\000\000\000"- 00:07:48.209 #65 DONE cov: 11843 ft: 15234 corp: 44/723b lim: 45 exec/s: 32 rss: 69Mb 00:07:48.209 ###### Recommended dictionary. ###### 00:07:48.209 "\001\002\347\236\247\3644\240" # Uses: 2 00:07:48.209 ">\000\000\000\000\000\000\000" # Uses: 2 00:07:48.209 "\027\222fO\237\347\002\000" # Uses: 1 00:07:48.209 "\001\000\000\000\000\000\000\000" # Uses: 0 00:07:48.209 ###### End of recommended dictionary. 
###### 00:07:48.209 Done 65 runs in 2 second(s) 00:07:48.468 07:03:06 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:48.468 07:03:06 -- ../common.sh@72 -- # (( i++ )) 00:07:48.468 07:03:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.468 07:03:06 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:48.468 07:03:06 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:48.468 07:03:06 -- nvmf/run.sh@24 -- # local timen=1 00:07:48.468 07:03:06 -- nvmf/run.sh@25 -- # local core=0x1 00:07:48.468 07:03:06 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:48.468 07:03:06 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:48.468 07:03:06 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:48.468 07:03:06 -- nvmf/run.sh@29 -- # port=4406 00:07:48.468 07:03:06 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:48.468 07:03:06 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:48.468 07:03:06 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:48.468 07:03:06 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:48.468 [2024-12-13 07:03:06.507272] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:48.468 [2024-12-13 07:03:06.507336] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid492433 ] 00:07:48.468 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.468 [2024-12-13 07:03:06.690370] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.728 [2024-12-13 07:03:06.710647] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:48.728 [2024-12-13 07:03:06.710773] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.728 [2024-12-13 07:03:06.762305] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:48.728 [2024-12-13 07:03:06.778624] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:48.728 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.728 INFO: Seed: 2575785538 00:07:48.728 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:48.728 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:48.728 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:48.728 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.728 #2 INITED exec/s: 0 rss: 59Mb 00:07:48.728 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
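The "###### Recommended dictionary ######" blocks that close runs 4 and 5 above are libFuzzer's persistent auto-dictionary: byte strings, printed with C octal escapes, that repeatedly helped reach new coverage, with "# Uses: N" counting how often each one contributed during the run. The PersAutoDict and CMP mutations in the MS: columns are the steps that spliced one of those strings back into an input, and the same byte values recur in the cdw10/cdw11 fields of the commands tagged with the matching DE: entry. Entries like these can be carried into later runs with libFuzzer's -dict=<file> option, which accepts the AFL dictionary format. A hypothetical file for this target is sketched below, taking two entries from run 5's block above and re-escaping them from octal to hex; the file name and kw labels are arbitrary, not part of the log.

# nvme_fuzz.dict -- hypothetical -dict= file; names are arbitrary labels.
# "\001\002\347\236\247\3644\240" from the recommended-dictionary block, in hex:
kw1="\x01\x02\xE7\x9E\xA7\xF4\x34\xA0"
# ">\000\000\000\000\000\000\000" from the same block:
kw2=">\x00\x00\x00\x00\x00\x00\x00"

Feeding a seed dictionary this way gives a fresh run (which, as the banner above notes, starts from an empty corpus) immediate access to the magic values that otherwise had to be rediscovered by comparison tracing.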
00:07:48.728 This may also happen if the target rejected all inputs we tried so far 00:07:48.728 [2024-12-13 07:03:06.849066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:48.728 [2024-12-13 07:03:06.849106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.728 [2024-12-13 07:03:06.849233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:48.728 [2024-12-13 07:03:06.849251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.728 [2024-12-13 07:03:06.849367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:48.728 [2024-12-13 07:03:06.849382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.728 [2024-12-13 07:03:06.849495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:48.728 [2024-12-13 07:03:06.849512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.987 NEW_FUNC[1/668]: 0x45bec8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:48.987 NEW_FUNC[2/668]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.987 #3 NEW cov: 11530 ft: 11531 corp: 2/9b lim: 10 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:48.987 [2024-12-13 07:03:07.179987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:48.987 [2024-12-13 07:03:07.180035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.987 [2024-12-13 07:03:07.180176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:48.987 [2024-12-13 07:03:07.180206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.987 [2024-12-13 07:03:07.180338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d7a cdw11:00000000 00:07:48.987 [2024-12-13 07:03:07.180361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.987 [2024-12-13 07:03:07.180507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:48.987 [2024-12-13 07:03:07.180529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.987 NEW_FUNC[1/1]: 0xf86208 in spdk_sock_prep_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/sock.h:284 00:07:48.987 #4 NEW cov: 11646 ft: 12096 corp: 3/18b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 InsertByte- 00:07:49.246 [2024-12-13 07:03:07.229621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.246 [2024-12-13 07:03:07.229652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.246 [2024-12-13 07:03:07.229768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.229785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.229898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a3a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.229916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.230031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.230048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.247 #5 NEW cov: 11652 ft: 12387 corp: 4/26b lim: 10 exec/s: 0 rss: 67Mb L: 8/9 MS: 1 ChangeByte- 00:07:49.247 [2024-12-13 07:03:07.270276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000307a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.270305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.270434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.270453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.270574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.270592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.270707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.270724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.270856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.270874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.247 #6 NEW cov: 11737 ft: 12714 corp: 5/36b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertByte- 00:07:49.247 [2024-12-13 07:03:07.320348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.320375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.320485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.320504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.320619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.320636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.320752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a8f cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.320770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.320880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.320897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.247 #7 NEW cov: 11737 ft: 12821 corp: 6/46b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertByte- 00:07:49.247 [2024-12-13 07:03:07.360300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.360328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.360446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000597a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.360464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.360571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.360592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.360705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.360720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.247 #8 NEW cov: 11737 ft: 12862 corp: 7/55b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 InsertByte- 00:07:49.247 [2024-12-13 07:03:07.400482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000307a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.400512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.400632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.400649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.400765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO 
CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.400782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.400900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.400919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.247 #9 NEW cov: 11737 ft: 12908 corp: 8/64b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 CrossOver- 00:07:49.247 [2024-12-13 07:03:07.440760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000307a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.440787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.440890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.440906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.441015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.441030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.441143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.441160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.441279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.441296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.247 #10 NEW cov: 11737 ft: 12932 corp: 9/74b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:49.247 [2024-12-13 07:03:07.480689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.480718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.480842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a3d cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.480860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.480980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.480996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.247 [2024-12-13 07:03:07.481116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) 
qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.247 [2024-12-13 07:03:07.481135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.507 #11 NEW cov: 11737 ft: 13011 corp: 10/83b lim: 10 exec/s: 0 rss: 67Mb L: 9/10 MS: 1 InsertByte- 00:07:49.507 [2024-12-13 07:03:07.530813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000317a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.530841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.530986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.531004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.531118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a3a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.531134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.531271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.531289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.507 #12 NEW cov: 11737 ft: 13102 corp: 11/91b lim: 10 exec/s: 0 rss: 67Mb L: 8/10 MS: 1 ChangeByte- 00:07:49.507 [2024-12-13 07:03:07.570996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.571026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.571145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.571164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.571294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.571310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.571430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000780a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.571447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.507 #13 NEW cov: 11737 ft: 13147 corp: 12/99b lim: 10 exec/s: 0 rss: 67Mb L: 8/10 MS: 1 ChangeBit- 00:07:49.507 [2024-12-13 07:03:07.610831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.610859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.507 
[2024-12-13 07:03:07.610977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a3d cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.610997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.611115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.611130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.611257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.611273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.611388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000410a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.611405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.507 #14 NEW cov: 11737 ft: 13164 corp: 13/109b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 InsertByte- 00:07:49.507 [2024-12-13 07:03:07.650933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a67a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.650961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.651080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.651097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.651213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.651230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.651370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a8f cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.651387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.651506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.651525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.507 #15 NEW cov: 11737 ft: 13184 corp: 14/119b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:07:49.507 [2024-12-13 07:03:07.700953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000317a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.700982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.507 
[2024-12-13 07:03:07.701113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.701131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.701257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a3a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.701278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.507 [2024-12-13 07:03:07.701402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:49.507 [2024-12-13 07:03:07.701419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.507 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.507 #16 NEW cov: 11760 ft: 13306 corp: 15/127b lim: 10 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 ShuffleBytes- 00:07:49.766 [2024-12-13 07:03:07.751092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a67a cdw11:00000000 00:07:49.766 [2024-12-13 07:03:07.751121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.766 [2024-12-13 07:03:07.751239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.751257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.751370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.751387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.767 #17 NEW cov: 11760 ft: 13527 corp: 16/134b lim: 10 exec/s: 0 rss: 68Mb L: 7/10 MS: 1 CrossOver- 00:07:49.767 [2024-12-13 07:03:07.801609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.801637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.801753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.801772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.801874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.801891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.801998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.802015] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.767 #18 NEW cov: 11760 ft: 13547 corp: 17/143b lim: 10 exec/s: 18 rss: 68Mb L: 9/10 MS: 1 CrossOver- 00:07:49.767 [2024-12-13 07:03:07.851302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003030 cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.851333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.851457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.851474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.767 #19 NEW cov: 11760 ft: 13750 corp: 18/148b lim: 10 exec/s: 19 rss: 68Mb L: 5/10 MS: 1 CrossOver- 00:07:49.767 [2024-12-13 07:03:07.901852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.901881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.902000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.902018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.902131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.902149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.902269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.902290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.767 #20 NEW cov: 11760 ft: 13833 corp: 19/156b lim: 10 exec/s: 20 rss: 68Mb L: 8/10 MS: 1 CopyPart- 00:07:49.767 [2024-12-13 07:03:07.942021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.942051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.942171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.942195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.942318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.942333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.942457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002b2b cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.942476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.942595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00002b0a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.942612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.767 #21 NEW cov: 11760 ft: 13834 corp: 20/166b lim: 10 exec/s: 21 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:49.767 [2024-12-13 07:03:07.982104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000587a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.982132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.982264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.982282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.982397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.982416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.767 [2024-12-13 07:03:07.982534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:49.767 [2024-12-13 07:03:07.982551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.027 #22 NEW cov: 11760 ft: 13862 corp: 21/174b lim: 10 exec/s: 22 rss: 68Mb L: 8/10 MS: 1 ChangeByte- 00:07:50.027 [2024-12-13 07:03:08.032311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003030 cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.032338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.032458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a30 cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.032474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.032593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.032612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.032725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.032742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.027 #23 NEW cov: 11760 ft: 13944 corp: 22/182b lim: 10 exec/s: 23 rss: 68Mb L: 8/10 
MS: 1 CopyPart- 00:07:50.027 [2024-12-13 07:03:08.072696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fa7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.072736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.072845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.072863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.072981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.072999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.073121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a8f cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.073138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.073262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.073280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.027 #24 NEW cov: 11760 ft: 14051 corp: 23/192b lim: 10 exec/s: 24 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:07:50.027 [2024-12-13 07:03:08.112330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.112357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.112480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.112497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.112614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.112632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.112745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.112762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.112875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.112891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.027 #25 NEW cov: 11760 ft: 14063 corp: 24/202b lim: 10 exec/s: 25 rss: 68Mb L: 10/10 MS: 
1 CopyPart- 00:07:50.027 [2024-12-13 07:03:08.152804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.152832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.152957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a3d cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.152974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.153089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.153105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.153228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.153245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.153360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00004132 cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.153378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.027 #26 NEW cov: 11760 ft: 14132 corp: 25/212b lim: 10 exec/s: 26 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:50.027 [2024-12-13 07:03:08.192751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a67a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.192778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.192895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.192912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.193028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.193044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.193166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.193183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.027 #27 NEW cov: 11760 ft: 14178 corp: 26/221b lim: 10 exec/s: 27 rss: 69Mb L: 9/10 MS: 1 CopyPart- 00:07:50.027 [2024-12-13 07:03:08.232844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008685 cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.232871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 
m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.232986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008585 cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.233003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.233131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00008585 cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.233148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.027 [2024-12-13 07:03:08.233282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000085f9 cdw11:00000000 00:07:50.027 [2024-12-13 07:03:08.233300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.027 #28 NEW cov: 11760 ft: 14219 corp: 27/229b lim: 10 exec/s: 28 rss: 69Mb L: 8/10 MS: 1 ChangeBinInt- 00:07:50.287 [2024-12-13 07:03:08.283167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a67a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.283203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.283330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.283347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.283460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.283477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.283586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a8f cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.283603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.283729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007b0a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.283746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.287 #29 NEW cov: 11760 ft: 14255 corp: 28/239b lim: 10 exec/s: 29 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:07:50.287 [2024-12-13 07:03:08.322691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.322722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.322837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000597a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.322855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.322970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.322987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.323108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.323124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.287 #30 NEW cov: 11760 ft: 14276 corp: 29/247b lim: 10 exec/s: 30 rss: 69Mb L: 8/10 MS: 1 EraseBytes- 00:07:50.287 [2024-12-13 07:03:08.383368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.383397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.383517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.383535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.383647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a29 cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.383664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.383784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.383804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.287 #31 NEW cov: 11760 ft: 14293 corp: 30/255b lim: 10 exec/s: 31 rss: 69Mb L: 8/10 MS: 1 ChangeByte- 00:07:50.287 [2024-12-13 07:03:08.433710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002b2d cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.433739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.433859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.433877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.433990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.434006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.434123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.434138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.434262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.434280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.287 #32 NEW cov: 11760 ft: 14297 corp: 31/265b lim: 10 exec/s: 32 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:50.287 [2024-12-13 07:03:08.483640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.483668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.483786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.483803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.483920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d7a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.483938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.484051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000278f cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.484070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.287 [2024-12-13 07:03:08.484191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:50.287 [2024-12-13 07:03:08.484208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.287 #33 NEW cov: 11760 ft: 14304 corp: 32/275b lim: 10 exec/s: 33 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:50.546 [2024-12-13 07:03:08.533887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:50.546 [2024-12-13 07:03:08.533916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.546 [2024-12-13 07:03:08.534035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.546 [2024-12-13 07:03:08.534054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.546 [2024-12-13 07:03:08.534173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.546 [2024-12-13 07:03:08.534200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.546 [2024-12-13 07:03:08.534320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.546 [2024-12-13 07:03:08.534340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 
m:0 dnr:0 00:07:50.547 #34 NEW cov: 11760 ft: 14351 corp: 33/283b lim: 10 exec/s: 34 rss: 69Mb L: 8/10 MS: 1 ShuffleBytes- 00:07:50.547 [2024-12-13 07:03:08.574205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.574234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.574364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.574381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.574500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.574518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.574644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a27 cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.574662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.574784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00008f7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.574803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.547 #35 NEW cov: 11760 ft: 14369 corp: 34/293b lim: 10 exec/s: 35 rss: 69Mb L: 10/10 MS: 1 CopyPart- 00:07:50.547 [2024-12-13 07:03:08.624371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.624399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.624512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00003d7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.624529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.624638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a2d cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.624656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.624768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.624785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.624902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.624918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 
m:0 dnr:0 00:07:50.547 #36 NEW cov: 11760 ft: 14372 corp: 35/303b lim: 10 exec/s: 36 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:50.547 [2024-12-13 07:03:08.664219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.664250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.664387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.664404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.664526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.664545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.664672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a25 cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.664688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.547 #37 NEW cov: 11760 ft: 14374 corp: 36/311b lim: 10 exec/s: 37 rss: 69Mb L: 8/10 MS: 1 ChangeByte- 00:07:50.547 [2024-12-13 07:03:08.714620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007a72 cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.714650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.714761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a3d cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.714781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.714895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.714912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.715020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.715036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.715127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00004132 cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.715146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.547 #38 NEW cov: 11760 ft: 14381 corp: 37/321b lim: 10 exec/s: 38 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:07:50.547 [2024-12-13 07:03:08.764743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a67a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.764772] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.764886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.764903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.765019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d7a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.765035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.765148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a8f cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.765165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.547 [2024-12-13 07:03:08.765284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000840a cdw11:00000000 00:07:50.547 [2024-12-13 07:03:08.765302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.806 #39 NEW cov: 11760 ft: 14395 corp: 38/331b lim: 10 exec/s: 39 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:07:50.806 [2024-12-13 07:03:08.814729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fa7a cdw11:00000000 00:07:50.806 [2024-12-13 07:03:08.814757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.806 [2024-12-13 07:03:08.814865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00007a7a cdw11:00000000 00:07:50.806 [2024-12-13 07:03:08.814883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.806 [2024-12-13 07:03:08.814999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002d7a cdw11:00000000 00:07:50.806 [2024-12-13 07:03:08.815016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.806 [2024-12-13 07:03:08.815126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00007a8f cdw11:00000000 00:07:50.806 [2024-12-13 07:03:08.815144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.806 [2024-12-13 07:03:08.815262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00007a0a cdw11:00000000 00:07:50.806 [2024-12-13 07:03:08.815279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.806 #40 NEW cov: 11760 ft: 14397 corp: 39/341b lim: 10 exec/s: 20 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:50.806 #40 DONE cov: 11760 ft: 14397 corp: 39/341b lim: 10 exec/s: 20 rss: 69Mb 00:07:50.806 Done 40 runs in 2 second(s) 00:07:50.806 07:03:08 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 
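The xtrace entries just below set up and launch fuzzer run 7 (corpus directory, per-run JSON config, TCP port, then the llvm_nvme_fuzz invocation). As a readable companion, the same recipe is sketched here as a standalone shell function. This is a reconstruction from the trace, not captured output: the SPDK variable, the function body layout, and the redirection of sed's output into the per-run config are assumptions, while every flag, path, and value is copied from the trace itself.

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # workspace path, taken from this job
start_llvm_fuzz() {   # name as it appears at ../common.sh@73; the body is a sketch
    local fuzzer_type=$1 timen=$2 core=$3
    local corpus_dir=$SPDK/../corpus/llvm_nvmf_$fuzzer_type
    local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
    # port derivation inferred from the printf %02d / port=4407 pair in the trace (7 -> 4407)
    local port="44$(printf %02d "$fuzzer_type")"
    mkdir -p "$corpus_dir"
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # xtrace does not record redirections, so writing sed's output into $nvmf_cfg is assumed
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$SPDK/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"
    rm -rf "$nvmf_cfg"   # per-run cleanup, as seen at the end of run 6 above
}
start_llvm_fuzz 7 1 0x1   # fuzzer type 7, -t 1, core mask 0x1: the exact call in the trace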
00:07:50.806 07:03:08 -- ../common.sh@72 -- # (( i++ ))
00:07:50.806 07:03:08 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:50.806 07:03:08 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1
00:07:50.806 07:03:08 -- nvmf/run.sh@23 -- # local fuzzer_type=7
00:07:50.806 07:03:08 -- nvmf/run.sh@24 -- # local timen=1
00:07:50.806 07:03:08 -- nvmf/run.sh@25 -- # local core=0x1
00:07:50.806 07:03:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:07:50.806 07:03:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf
00:07:50.806 07:03:08 -- nvmf/run.sh@29 -- # printf %02d 7
00:07:50.806 07:03:08 -- nvmf/run.sh@29 -- # port=4407
00:07:50.806 07:03:08 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:07:50.806 07:03:08 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407'
00:07:50.806 07:03:08 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:50.806 07:03:08 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock
00:07:50.806 [2024-12-13 07:03:08.995965] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:07:50.806 [2024-12-13 07:03:08.996028] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid492970 ]
00:07:50.806 EAL: No free 2048 kB hugepages reported on node 1
00:07:51.065 [2024-12-13 07:03:09.173683] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:51.065 [2024-12-13 07:03:09.193246] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:51.065 [2024-12-13 07:03:09.193385] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:51.065 [2024-12-13 07:03:09.244678] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:51.065 [2024-12-13 07:03:09.260985] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 ***
00:07:51.065 INFO: Running with entropic power schedule (0xFF, 100).
00:07:51.065 INFO: Seed: 762013595
00:07:51.065 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:07:51.065 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:07:51.065 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7
00:07:51.065 INFO: A corpus is not provided, starting from an empty corpus
00:07:51.065 #2 INITED exec/s: 0 rss: 59Mb
00:07:51.065 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:51.065 This may also happen if the target rejected all inputs we tried so far
00:07:51.324 [2024-12-13 07:03:09.308529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a7a cdw11:00000000
00:07:51.324 [2024-12-13 07:03:09.308558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:51.583 NEW_FUNC[1/669]: 0x45c8c8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172
00:07:51.583 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:51.583 #4 NEW cov: 11533 ft: 11534 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 2 ShuffleBytes-InsertByte-
00:07:51.583 [2024-12-13 07:03:09.609252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a0a cdw11:00000000
00:07:51.583 [2024-12-13 07:03:09.609283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:51.583 #6 NEW cov: 11646 ft: 11979 corp: 3/5b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 2 CrossOver-InsertByte-
00:07:51.583 [2024-12-13 07:03:09.649312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a0a cdw11:00000000
00:07:51.583 [2024-12-13 07:03:09.649337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:51.583 #7 NEW cov: 11652 ft: 12315 corp: 4/7b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ShuffleBytes-
00:07:51.583 [2024-12-13 07:03:09.689456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a12 cdw11:00000000
00:07:51.583 [2024-12-13 07:03:09.689481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:51.583 #8 NEW cov: 11737 ft: 12543 corp: 5/9b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBinInt-
00:07:51.583 [2024-12-13 07:03:09.729520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000273a cdw11:00000000
00:07:51.583 [2024-12-13 07:03:09.729545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:51.583 #10 NEW cov: 11737 ft: 12721 corp: 6/11b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 2 EraseBytes-InsertByte-
00:07:51.583 [2024-12-13 07:03:09.769663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a0b cdw11:00000000
00:07:51.583 [2024-12-13 07:03:09.769687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:51.583 #11 NEW cov: 11737 ft: 12824 corp: 7/13b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBinInt-
00:07:51.583 [2024-12-13 07:03:09.809773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002727 cdw11:00000000
00:07:51.583 [2024-12-13 07:03:09.809799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:51.842 #12 NEW cov: 11737 ft: 12890 corp: 8/15b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart-
00:07:51.842
[2024-12-13 07:03:09.850234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.842 [2024-12-13 07:03:09.850260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 [2024-12-13 07:03:09.850311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.842 [2024-12-13 07:03:09.850325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.842 [2024-12-13 07:03:09.850375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.842 [2024-12-13 07:03:09.850387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.842 [2024-12-13 07:03:09.850437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:51.842 [2024-12-13 07:03:09.850451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.842 #14 NEW cov: 11737 ft: 13284 corp: 9/23b lim: 10 exec/s: 0 rss: 68Mb L: 8/8 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:51.842 [2024-12-13 07:03:09.889972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000270a cdw11:00000000 00:07:51.842 [2024-12-13 07:03:09.889997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 #16 NEW cov: 11737 ft: 13341 corp: 10/26b lim: 10 exec/s: 0 rss: 68Mb L: 3/8 MS: 2 EraseBytes-CrossOver- 00:07:51.842 [2024-12-13 07:03:09.930117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:51.842 [2024-12-13 07:03:09.930141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 #17 NEW cov: 11737 ft: 13382 corp: 11/28b lim: 10 exec/s: 0 rss: 68Mb L: 2/8 MS: 1 CrossOver- 00:07:51.842 [2024-12-13 07:03:09.970386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000270a cdw11:00000000 00:07:51.842 [2024-12-13 07:03:09.970410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 [2024-12-13 07:03:09.970460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007a27 cdw11:00000000 00:07:51.842 [2024-12-13 07:03:09.970473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.842 #18 NEW cov: 11737 ft: 13568 corp: 12/32b lim: 10 exec/s: 0 rss: 68Mb L: 4/8 MS: 1 CopyPart- 00:07:51.842 [2024-12-13 07:03:10.010371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0a cdw11:00000000 00:07:51.842 [2024-12-13 07:03:10.010396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 #19 NEW cov: 11737 ft: 13643 corp: 13/34b lim: 10 exec/s: 0 rss: 68Mb L: 2/8 MS: 1 ChangeBit- 
00:07:51.842 [2024-12-13 07:03:10.050617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000700 cdw11:00000000 00:07:51.842 [2024-12-13 07:03:10.050642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.842 [2024-12-13 07:03:10.050691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.842 [2024-12-13 07:03:10.050706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.842 #22 NEW cov: 11737 ft: 13706 corp: 14/38b lim: 10 exec/s: 0 rss: 68Mb L: 4/8 MS: 3 EraseBytes-ChangeBit-InsertRepeatedBytes- 00:07:52.101 [2024-12-13 07:03:10.090916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002727 cdw11:00000000 00:07:52.101 [2024-12-13 07:03:10.090942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.101 [2024-12-13 07:03:10.091007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008f8f cdw11:00000000 00:07:52.101 [2024-12-13 07:03:10.091020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.101 [2024-12-13 07:03:10.091071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008f8f cdw11:00000000 00:07:52.101 [2024-12-13 07:03:10.091084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.101 [2024-12-13 07:03:10.091132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00008f8f cdw11:00000000 00:07:52.101 [2024-12-13 07:03:10.091145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.101 #23 NEW cov: 11737 ft: 13727 corp: 15/46b lim: 10 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:52.101 [2024-12-13 07:03:10.130696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a0a cdw11:00000000 00:07:52.101 [2024-12-13 07:03:10.130723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.101 #24 NEW cov: 11737 ft: 13755 corp: 16/48b lim: 10 exec/s: 0 rss: 68Mb L: 2/8 MS: 1 ShuffleBytes- 00:07:52.101 [2024-12-13 07:03:10.170942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000701 cdw11:00000000 00:07:52.101 [2024-12-13 07:03:10.170967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.101 [2024-12-13 07:03:10.171017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001f00 cdw11:00000000 00:07:52.101 [2024-12-13 07:03:10.171030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.101 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:52.101 #25 NEW cov: 11760 ft: 13846 corp: 17/52b lim: 10 
exec/s: 0 rss: 68Mb L: 4/8 MS: 1 CMP- DE: "\001\037"- 00:07:52.102 [2024-12-13 07:03:10.210948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a02 cdw11:00000000 00:07:52.102 [2024-12-13 07:03:10.210974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.102 #26 NEW cov: 11760 ft: 13938 corp: 18/54b lim: 10 exec/s: 0 rss: 69Mb L: 2/8 MS: 1 ChangeBinInt- 00:07:52.102 [2024-12-13 07:03:10.251177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000701 cdw11:00000000 00:07:52.102 [2024-12-13 07:03:10.251207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.102 [2024-12-13 07:03:10.251259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000400 cdw11:00000000 00:07:52.102 [2024-12-13 07:03:10.251272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.102 #27 NEW cov: 11760 ft: 13949 corp: 19/58b lim: 10 exec/s: 0 rss: 69Mb L: 4/8 MS: 1 ChangeBinInt- 00:07:52.102 [2024-12-13 07:03:10.291519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002727 cdw11:00000000 00:07:52.102 [2024-12-13 07:03:10.291544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.102 [2024-12-13 07:03:10.291598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008f8f cdw11:00000000 00:07:52.102 [2024-12-13 07:03:10.291611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.102 [2024-12-13 07:03:10.291661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008f8f cdw11:00000000 00:07:52.102 [2024-12-13 07:03:10.291675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.102 [2024-12-13 07:03:10.291725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00008f8f cdw11:00000000 00:07:52.102 [2024-12-13 07:03:10.291738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.102 #28 NEW cov: 11760 ft: 13981 corp: 20/66b lim: 10 exec/s: 28 rss: 69Mb L: 8/8 MS: 1 ShuffleBytes- 00:07:52.102 [2024-12-13 07:03:10.331308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:52.102 [2024-12-13 07:03:10.331333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.361 #29 NEW cov: 11760 ft: 13987 corp: 21/68b lim: 10 exec/s: 29 rss: 69Mb L: 2/8 MS: 1 ChangeBit- 00:07:52.361 [2024-12-13 07:03:10.371400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a3e cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.371424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.361 #30 NEW cov: 11760 ft: 13994 corp: 22/70b lim: 10 
exec/s: 30 rss: 69Mb L: 2/8 MS: 1 ChangeByte- 00:07:52.361 [2024-12-13 07:03:10.411838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.411863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.361 [2024-12-13 07:03:10.411913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.411927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.361 [2024-12-13 07:03:10.411975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff01 cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.411989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.361 [2024-12-13 07:03:10.412035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00001f0a cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.412048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.361 #31 NEW cov: 11760 ft: 14038 corp: 23/78b lim: 10 exec/s: 31 rss: 69Mb L: 8/8 MS: 1 PersAutoDict- DE: "\001\037"- 00:07:52.361 [2024-12-13 07:03:10.451642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000410b cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.451666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.361 #32 NEW cov: 11760 ft: 14111 corp: 24/80b lim: 10 exec/s: 32 rss: 69Mb L: 2/8 MS: 1 ChangeBinInt- 00:07:52.361 [2024-12-13 07:03:10.491730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000260a cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.491754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.361 #33 NEW cov: 11760 ft: 14115 corp: 25/82b lim: 10 exec/s: 33 rss: 69Mb L: 2/8 MS: 1 ChangeByte- 00:07:52.361 [2024-12-13 07:03:10.521859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a0a cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.521886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.361 #34 NEW cov: 11760 ft: 14177 corp: 26/84b lim: 10 exec/s: 34 rss: 69Mb L: 2/8 MS: 1 ChangeByte- 00:07:52.361 [2024-12-13 07:03:10.561935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff3a cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.561976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.361 #35 NEW cov: 11760 ft: 14188 corp: 27/87b lim: 10 exec/s: 35 rss: 69Mb L: 3/8 MS: 1 InsertByte- 00:07:52.361 [2024-12-13 07:03:10.592025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b3a cdw11:00000000 00:07:52.361 [2024-12-13 07:03:10.592049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.620 #36 NEW cov: 11760 ft: 14241 corp: 28/90b lim: 10 exec/s: 36 rss: 69Mb L: 3/8 MS: 1 InsertByte- 00:07:52.620 [2024-12-13 07:03:10.632275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b82 cdw11:00000000 00:07:52.620 [2024-12-13 07:03:10.632300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.620 [2024-12-13 07:03:10.632349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003a3e cdw11:00000000 00:07:52.620 [2024-12-13 07:03:10.632362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.620 #37 NEW cov: 11760 ft: 14251 corp: 29/94b lim: 10 exec/s: 37 rss: 69Mb L: 4/8 MS: 1 InsertByte- 00:07:52.620 [2024-12-13 07:03:10.672302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b3a cdw11:00000000 00:07:52.620 [2024-12-13 07:03:10.672326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.620 #38 NEW cov: 11760 ft: 14272 corp: 30/97b lim: 10 exec/s: 38 rss: 69Mb L: 3/8 MS: 1 ChangeBit- 00:07:52.620 [2024-12-13 07:03:10.712385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d1c5 cdw11:00000000 00:07:52.620 [2024-12-13 07:03:10.712410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.620 #39 NEW cov: 11760 ft: 14291 corp: 31/99b lim: 10 exec/s: 39 rss: 69Mb L: 2/8 MS: 1 ChangeBinInt- 00:07:52.620 [2024-12-13 07:03:10.752862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fff7 cdw11:00000000 00:07:52.620 [2024-12-13 07:03:10.752885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.620 [2024-12-13 07:03:10.752949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.620 [2024-12-13 07:03:10.752963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.620 [2024-12-13 07:03:10.753012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.620 [2024-12-13 07:03:10.753026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.620 [2024-12-13 07:03:10.753073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:52.620 [2024-12-13 07:03:10.753086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.620 #40 NEW cov: 11760 ft: 14331 corp: 32/107b lim: 10 exec/s: 40 rss: 69Mb L: 8/8 MS: 1 ChangeBit- 00:07:52.621 [2024-12-13 07:03:10.792589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000272a cdw11:00000000 00:07:52.621 [2024-12-13 07:03:10.792615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.621 #41 NEW cov: 11760 ft: 14402 corp: 33/109b lim: 10 exec/s: 41 rss: 69Mb L: 2/8 MS: 1 ChangeBit- 00:07:52.621 [2024-12-13 07:03:10.832738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a27 cdw11:00000000 00:07:52.621 [2024-12-13 07:03:10.832762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.621 #47 NEW cov: 11760 ft: 14410 corp: 34/111b lim: 10 exec/s: 47 rss: 69Mb L: 2/8 MS: 1 ShuffleBytes- 00:07:52.880 [2024-12-13 07:03:10.872897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000d15b cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.872921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.880 #48 NEW cov: 11760 ft: 14418 corp: 35/113b lim: 10 exec/s: 48 rss: 69Mb L: 2/8 MS: 1 ChangeByte- 00:07:52.880 [2024-12-13 07:03:10.913007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a12 cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.913032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.880 #49 NEW cov: 11760 ft: 14426 corp: 36/115b lim: 10 exec/s: 49 rss: 70Mb L: 2/8 MS: 1 CopyPart- 00:07:52.880 [2024-12-13 07:03:10.953479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fff7 cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.953504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:10.953555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.953569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:10.953619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff28 cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.953631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:10.953680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.953693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:10.993697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fff7 cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.993721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:10.993787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.993800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:10.993850] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff28 cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.993864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:10.993915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000aff cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.993928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:10.993977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:52.880 [2024-12-13 07:03:10.993993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.880 #51 NEW cov: 11760 ft: 14472 corp: 37/125b lim: 10 exec/s: 51 rss: 70Mb L: 10/10 MS: 2 InsertByte-CrossOver- 00:07:52.880 [2024-12-13 07:03:11.033414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002601 cdw11:00000000 00:07:52.880 [2024-12-13 07:03:11.033438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:11.033488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001f00 cdw11:00000000 00:07:52.880 [2024-12-13 07:03:11.033502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.880 #52 NEW cov: 11760 ft: 14484 corp: 38/129b lim: 10 exec/s: 52 rss: 70Mb L: 4/10 MS: 1 ChangeByte- 00:07:52.880 [2024-12-13 07:03:11.073538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000272a cdw11:00000000 00:07:52.880 [2024-12-13 07:03:11.073563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:11.073625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000272a cdw11:00000000 00:07:52.880 [2024-12-13 07:03:11.073639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:11.113649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002731 cdw11:00000000 00:07:52.880 [2024-12-13 07:03:11.113673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.880 [2024-12-13 07:03:11.113724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000272a cdw11:00000000 00:07:52.880 [2024-12-13 07:03:11.113737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.139 #54 NEW cov: 11760 ft: 14488 corp: 39/133b lim: 10 exec/s: 54 rss: 70Mb L: 4/10 MS: 2 CopyPart-ChangeBinInt- 00:07:53.139 [2024-12-13 07:03:11.153637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ba02 cdw11:00000000 00:07:53.139 [2024-12-13 07:03:11.153661] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.139 #55 NEW cov: 11760 ft: 14494 corp: 40/135b lim: 10 exec/s: 55 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:53.139 [2024-12-13 07:03:11.193793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000cfc5 cdw11:00000000 00:07:53.139 [2024-12-13 07:03:11.193817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.139 #56 NEW cov: 11760 ft: 14547 corp: 41/137b lim: 10 exec/s: 56 rss: 70Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:53.139 [2024-12-13 07:03:11.234064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:53.139 [2024-12-13 07:03:11.234088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.139 [2024-12-13 07:03:11.234155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002b82 cdw11:00000000 00:07:53.139 [2024-12-13 07:03:11.234169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.139 [2024-12-13 07:03:11.234221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003a3e cdw11:00000000 00:07:53.139 [2024-12-13 07:03:11.234234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.139 #57 NEW cov: 11760 ft: 14683 corp: 42/143b lim: 10 exec/s: 57 rss: 70Mb L: 6/10 MS: 1 CMP- DE: "\377\377"- 00:07:53.139 [2024-12-13 07:03:11.274011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a0a cdw11:00000000 00:07:53.139 [2024-12-13 07:03:11.274037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.139 #58 NEW cov: 11760 ft: 14719 corp: 43/145b lim: 10 exec/s: 29 rss: 70Mb L: 2/10 MS: 1 ChangeBit- 00:07:53.139 #58 DONE cov: 11760 ft: 14719 corp: 43/145b lim: 10 exec/s: 29 rss: 70Mb 00:07:53.139 ###### Recommended dictionary. ###### 00:07:53.139 "\001\037" # Uses: 1 00:07:53.139 "\377\377" # Uses: 0 00:07:53.139 ###### End of recommended dictionary. 
######
00:07:53.139 Done 58 runs in 2 second(s)
00:07:53.398 07:03:11 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf
00:07:53.398 07:03:11 -- ../common.sh@72 -- # (( i++ ))
00:07:53.398 07:03:11 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:53.398 07:03:11 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1
00:07:53.398 07:03:11 -- nvmf/run.sh@23 -- # local fuzzer_type=8
00:07:53.398 07:03:11 -- nvmf/run.sh@24 -- # local timen=1
00:07:53.398 07:03:11 -- nvmf/run.sh@25 -- # local core=0x1
00:07:53.398 07:03:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:53.398 07:03:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf
00:07:53.398 07:03:11 -- nvmf/run.sh@29 -- # printf %02d 8
00:07:53.398 07:03:11 -- nvmf/run.sh@29 -- # port=4408
00:07:53.398 07:03:11 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8
00:07:53.398 07:03:11 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408'
00:07:53.398 07:03:11 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:53.398 07:03:11 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock
00:07:53.399 [2024-12-13 07:03:11.446608] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:07:53.399 [2024-12-13 07:03:11.446674] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid493306 ]
00:07:53.399 EAL: No free 2048 kB hugepages reported on node 1
00:07:53.399 [2024-12-13 07:03:11.629839] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:53.658 [2024-12-13 07:03:11.649902] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:53.658 [2024-12-13 07:03:11.650040] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:53.658 [2024-12-13 07:03:11.701451] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:53.658 [2024-12-13 07:03:11.717759] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 ***
00:07:53.658 INFO: Running with entropic power schedule (0xFF, 100).
00:07:53.658 INFO: Seed: 3221819533 00:07:53.658 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:53.658 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:53.658 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:53.658 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.658 [2024-12-13 07:03:11.782990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.658 [2024-12-13 07:03:11.783018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.658 #2 INITED cov: 11538 ft: 11562 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:53.658 [2024-12-13 07:03:11.813103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.658 [2024-12-13 07:03:11.813130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.658 [2024-12-13 07:03:11.813184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.658 [2024-12-13 07:03:11.813203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.658 #3 NEW cov: 11674 ft: 12771 corp: 2/3b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 CopyPart- 00:07:53.658 [2024-12-13 07:03:11.863112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.658 [2024-12-13 07:03:11.863138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.658 #4 NEW cov: 11680 ft: 12897 corp: 3/4b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ChangeByte- 00:07:53.917 [2024-12-13 07:03:11.903560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.917 [2024-12-13 07:03:11.903586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.917 [2024-12-13 07:03:11.903642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.917 [2024-12-13 07:03:11.903656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.917 [2024-12-13 07:03:11.903728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.917 [2024-12-13 07:03:11.903741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.917 #5 NEW cov: 11765 ft: 13515 corp: 4/7b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 CrossOver- 00:07:53.917 [2024-12-13 07:03:11.953991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT 
(15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.917 [2024-12-13 07:03:11.954016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.917 [2024-12-13 07:03:11.954092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.917 [2024-12-13 07:03:11.954106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.917 [2024-12-13 07:03:11.954164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.917 [2024-12-13 07:03:11.954178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.917 [2024-12-13 07:03:11.954239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.917 [2024-12-13 07:03:11.954252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.917 [2024-12-13 07:03:11.954310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:11.954323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.918 #6 NEW cov: 11765 ft: 13904 corp: 5/12b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:53.918 [2024-12-13 07:03:11.993639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:11.993664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.918 [2024-12-13 07:03:11.993736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:11.993750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.918 #7 NEW cov: 11765 ft: 13963 corp: 6/14b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ChangeByte- 00:07:53.918 [2024-12-13 07:03:12.033605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.033630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.918 #8 NEW cov: 11765 ft: 14109 corp: 7/15b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:53.918 [2024-12-13 07:03:12.073870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.073896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.918 [2024-12-13 07:03:12.073969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.073982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.918 #9 NEW cov: 11765 ft: 14231 corp: 8/17b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:53.918 [2024-12-13 07:03:12.113997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.114022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.918 [2024-12-13 07:03:12.114076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.114089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.918 #10 NEW cov: 11765 ft: 14268 corp: 9/19b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 CopyPart- 00:07:53.918 [2024-12-13 07:03:12.154602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.154628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.918 [2024-12-13 07:03:12.154686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.154700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.918 [2024-12-13 07:03:12.154755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.154769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.918 [2024-12-13 07:03:12.154823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.154840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.918 [2024-12-13 07:03:12.154894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.918 [2024-12-13 07:03:12.154907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.177 #11 NEW cov: 11765 ft: 14354 corp: 10/24b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 CrossOver- 00:07:54.177 [2024-12-13 07:03:12.194103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.177 [2024-12-13 07:03:12.194128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.177 #12 NEW cov: 11765 ft: 14392 corp: 11/25b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 CrossOver- 00:07:54.177 [2024-12-13 07:03:12.234386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.177 [2024-12-13 07:03:12.234411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.177 [2024-12-13 07:03:12.234481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.177 [2024-12-13 07:03:12.234496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.177 #13 NEW cov: 11765 ft: 14403 corp: 12/27b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 CrossOver- 00:07:54.177 [2024-12-13 07:03:12.274350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.177 [2024-12-13 07:03:12.274374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.177 #14 NEW cov: 11765 ft: 14436 corp: 13/28b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:54.177 [2024-12-13 07:03:12.314475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.177 [2024-12-13 07:03:12.314501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.177 #15 NEW cov: 11765 ft: 14530 corp: 14/29b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:54.177 [2024-12-13 07:03:12.354745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.177 [2024-12-13 07:03:12.354771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.177 [2024-12-13 07:03:12.354824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.177 [2024-12-13 07:03:12.354838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.177 #16 NEW cov: 11765 ft: 14539 corp: 15/31b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ChangeByte- 00:07:54.177 [2024-12-13 07:03:12.394700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.177 [2024-12-13 07:03:12.394725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.177 #17 NEW cov: 11765 ft: 14638 corp: 16/32b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 CrossOver- 00:07:54.436 [2024-12-13 07:03:12.434807] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.434834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.436 #18 NEW cov: 11765 ft: 14654 corp: 17/33b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 EraseBytes- 00:07:54.436 [2024-12-13 07:03:12.475086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.475111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.436 [2024-12-13 07:03:12.475166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.475180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.436 #19 NEW cov: 11765 ft: 14673 corp: 18/35b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 InsertByte- 00:07:54.436 [2024-12-13 07:03:12.515224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.515249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.436 [2024-12-13 07:03:12.515323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.515338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.436 #20 NEW cov: 11765 ft: 14703 corp: 19/37b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:54.436 [2024-12-13 07:03:12.555313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.555337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.436 [2024-12-13 07:03:12.555408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.555422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.436 #21 NEW cov: 11765 ft: 14730 corp: 20/39b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 CopyPart- 00:07:54.436 [2024-12-13 07:03:12.595764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.595788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.436 [2024-12-13 07:03:12.595842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 
nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.595855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.436 [2024-12-13 07:03:12.595911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.595924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.436 [2024-12-13 07:03:12.595975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.595991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.436 #22 NEW cov: 11765 ft: 14755 corp: 21/43b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:54.436 [2024-12-13 07:03:12.635550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.635576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.436 [2024-12-13 07:03:12.635629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.436 [2024-12-13 07:03:12.635642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.695 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.695 #23 NEW cov: 11788 ft: 14803 corp: 22/45b lim: 5 exec/s: 23 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:54.695 [2024-12-13 07:03:12.926696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.695 [2024-12-13 07:03:12.926751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.695 [2024-12-13 07:03:12.926836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.695 [2024-12-13 07:03:12.926862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.695 [2024-12-13 07:03:12.926942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.695 [2024-12-13 07:03:12.926966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.954 #24 NEW cov: 11788 ft: 15022 corp: 23/48b lim: 5 exec/s: 24 rss: 68Mb L: 3/5 MS: 1 ChangeByte- 00:07:54.954 [2024-12-13 07:03:12.976741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:54.954 [2024-12-13 07:03:12.976767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:12.976823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:12.976837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:12.976893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:12.976906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:12.976960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:12.976973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.954 #25 NEW cov: 11788 ft: 15032 corp: 24/52b lim: 5 exec/s: 25 rss: 68Mb L: 4/5 MS: 1 ChangeBit- 00:07:54.954 [2024-12-13 07:03:13.016860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.016889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:13.016965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.016979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:13.017034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.017047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:13.017101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.017114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.954 #26 NEW cov: 11788 ft: 15046 corp: 25/56b lim: 5 exec/s: 26 rss: 68Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:54.954 [2024-12-13 07:03:13.056481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.056507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.954 #27 NEW cov: 11788 ft: 15081 corp: 26/57b lim: 5 exec/s: 27 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:07:54.954 [2024-12-13 
07:03:13.096907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.096933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:13.097008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.097023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:13.097080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.097093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.954 #28 NEW cov: 11788 ft: 15082 corp: 27/60b lim: 5 exec/s: 28 rss: 68Mb L: 3/5 MS: 1 CopyPart- 00:07:54.954 [2024-12-13 07:03:13.137065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.137090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:13.137148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.137162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.954 [2024-12-13 07:03:13.137224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.137238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.954 #29 NEW cov: 11788 ft: 15103 corp: 28/63b lim: 5 exec/s: 29 rss: 68Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:54.954 [2024-12-13 07:03:13.176858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.954 [2024-12-13 07:03:13.176883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.214 #30 NEW cov: 11788 ft: 15128 corp: 29/64b lim: 5 exec/s: 30 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:55.214 [2024-12-13 07:03:13.217124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.217150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.214 [2024-12-13 07:03:13.217230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 
[2024-12-13 07:03:13.217245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.214 #31 NEW cov: 11788 ft: 15152 corp: 30/66b lim: 5 exec/s: 31 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:55.214 [2024-12-13 07:03:13.257257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.257282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.214 [2024-12-13 07:03:13.257357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.257371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.214 #32 NEW cov: 11788 ft: 15155 corp: 31/68b lim: 5 exec/s: 32 rss: 69Mb L: 2/5 MS: 1 CopyPart- 00:07:55.214 [2024-12-13 07:03:13.297511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.297536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.214 [2024-12-13 07:03:13.297594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.297607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.214 [2024-12-13 07:03:13.297665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.297678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.214 #33 NEW cov: 11788 ft: 15163 corp: 32/71b lim: 5 exec/s: 33 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:55.214 [2024-12-13 07:03:13.337819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.337845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.214 [2024-12-13 07:03:13.337901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.337914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.214 [2024-12-13 07:03:13.337966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.337986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.214 [2024-12-13 07:03:13.338040] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.338053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.214 #34 NEW cov: 11788 ft: 15169 corp: 33/75b lim: 5 exec/s: 34 rss: 69Mb L: 4/5 MS: 1 InsertByte- 00:07:55.214 [2024-12-13 07:03:13.377595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.377620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.214 [2024-12-13 07:03:13.377676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.377689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.214 #35 NEW cov: 11788 ft: 15180 corp: 34/77b lim: 5 exec/s: 35 rss: 69Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:55.214 [2024-12-13 07:03:13.417548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.214 [2024-12-13 07:03:13.417574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.214 #36 NEW cov: 11788 ft: 15201 corp: 35/78b lim: 5 exec/s: 36 rss: 69Mb L: 1/5 MS: 1 EraseBytes- 00:07:55.474 [2024-12-13 07:03:13.457727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.457754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.474 #37 NEW cov: 11788 ft: 15256 corp: 36/79b lim: 5 exec/s: 37 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:07:55.474 [2024-12-13 07:03:13.497819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.497844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.474 #38 NEW cov: 11788 ft: 15270 corp: 37/80b lim: 5 exec/s: 38 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:07:55.474 [2024-12-13 07:03:13.538075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.538100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.474 [2024-12-13 07:03:13.538172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.538191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.474 #39 NEW cov: 11788 ft: 15279 corp: 
38/82b lim: 5 exec/s: 39 rss: 69Mb L: 2/5 MS: 1 EraseBytes- 00:07:55.474 [2024-12-13 07:03:13.578170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.578199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.474 [2024-12-13 07:03:13.578260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.578277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.474 #40 NEW cov: 11788 ft: 15300 corp: 39/84b lim: 5 exec/s: 40 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:55.474 [2024-12-13 07:03:13.618295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.618321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.474 [2024-12-13 07:03:13.618394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.618409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.474 #41 NEW cov: 11788 ft: 15308 corp: 40/86b lim: 5 exec/s: 41 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:55.474 [2024-12-13 07:03:13.658449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.658474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.474 [2024-12-13 07:03:13.658545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.658559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.474 #42 NEW cov: 11788 ft: 15337 corp: 41/88b lim: 5 exec/s: 42 rss: 69Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:55.474 [2024-12-13 07:03:13.698891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.698917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.474 [2024-12-13 07:03:13.698976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.698990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.474 [2024-12-13 07:03:13.699047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 
nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.699060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.474 [2024-12-13 07:03:13.699116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.474 [2024-12-13 07:03:13.699129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.734 #43 NEW cov: 11788 ft: 15339 corp: 42/92b lim: 5 exec/s: 43 rss: 69Mb L: 4/5 MS: 1 ChangeByte- 00:07:55.734 [2024-12-13 07:03:13.738706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-12-13 07:03:13.738731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.734 [2024-12-13 07:03:13.738787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-12-13 07:03:13.738800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.734 #44 NEW cov: 11788 ft: 15376 corp: 43/94b lim: 5 exec/s: 44 rss: 69Mb L: 2/5 MS: 1 CrossOver- 00:07:55.734 [2024-12-13 07:03:13.778749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-12-13 07:03:13.778774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.734 [2024-12-13 07:03:13.778831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.734 [2024-12-13 07:03:13.778844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.734 #45 NEW cov: 11788 ft: 15391 corp: 44/96b lim: 5 exec/s: 22 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:07:55.734 #45 DONE cov: 11788 ft: 15391 corp: 44/96b lim: 5 exec/s: 22 rss: 69Mb 00:07:55.734 Done 45 runs in 2 second(s) 00:07:55.734 07:03:13 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:55.734 07:03:13 -- ../common.sh@72 -- # (( i++ )) 00:07:55.734 07:03:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.734 07:03:13 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:55.734 07:03:13 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:55.734 07:03:13 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.734 07:03:13 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.734 07:03:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:55.734 07:03:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:55.734 07:03:13 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:55.734 07:03:13 -- nvmf/run.sh@29 -- # port=4409 00:07:55.734 07:03:13 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:55.734 07:03:13 -- nvmf/run.sh@32 -- # trid='trtype:tcp 
adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:55.734 07:03:13 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.734 07:03:13 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:55.734 [2024-12-13 07:03:13.960031] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:55.734 [2024-12-13 07:03:13.960108] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid493801 ] 00:07:55.993 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.993 [2024-12-13 07:03:14.135024] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.994 [2024-12-13 07:03:14.154602] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.994 [2024-12-13 07:03:14.154742] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.994 [2024-12-13 07:03:14.206139] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.994 [2024-12-13 07:03:14.222463] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:56.253 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:56.253 INFO: Seed: 1430853991 00:07:56.253 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:56.253 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:56.253 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:56.253 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.253 [2024-12-13 07:03:14.287704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.253 [2024-12-13 07:03:14.287731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.253 #2 INITED cov: 11561 ft: 11562 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:56.253 [2024-12-13 07:03:14.317797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.253 [2024-12-13 07:03:14.317822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.253 [2024-12-13 07:03:14.317891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.253 [2024-12-13 07:03:14.317905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.253 #3 NEW cov: 11674 ft: 12769 corp: 2/3b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:56.253 [2024-12-13 07:03:14.368000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.253 [2024-12-13 07:03:14.368024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.253 [2024-12-13 07:03:14.368079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.253 [2024-12-13 07:03:14.368092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.253 #4 NEW cov: 11680 ft: 13042 corp: 3/5b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ChangeBinInt- 00:07:56.253 [2024-12-13 07:03:14.408261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.253 [2024-12-13 07:03:14.408287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.253 [2024-12-13 07:03:14.408341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.253 [2024-12-13 07:03:14.408355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.253 #5 NEW cov: 11765 ft: 13377 corp: 4/7b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:56.253 [2024-12-13 07:03:14.448210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.253 [2024-12-13 07:03:14.448236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.253 [2024-12-13 07:03:14.448307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.253 [2024-12-13 07:03:14.448321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.254 #6 NEW cov: 11765 ft: 13489 corp: 5/9b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 ShuffleBytes- 00:07:56.254 [2024-12-13 07:03:14.488182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.254 [2024-12-13 07:03:14.488213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 #7 NEW cov: 11765 ft: 13566 corp: 6/10b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 EraseBytes- 00:07:56.514 [2024-12-13 07:03:14.528590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.528614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 [2024-12-13 07:03:14.528693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.528708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.514 [2024-12-13 07:03:14.528761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.528775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.514 #8 NEW cov: 11765 ft: 13846 corp: 7/13b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 InsertByte- 00:07:56.514 [2024-12-13 07:03:14.568680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.568704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 [2024-12-13 07:03:14.568761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.568774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.514 [2024-12-13 07:03:14.568831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.568844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.514 #9 NEW cov: 11765 ft: 13878 corp: 8/16b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 ChangeBinInt- 00:07:56.514 [2024-12-13 07:03:14.608644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.608668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 [2024-12-13 07:03:14.608723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.608737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.514 #10 NEW cov: 11765 ft: 13934 corp: 9/18b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 ChangeBit- 00:07:56.514 [2024-12-13 07:03:14.648751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.648776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 [2024-12-13 07:03:14.648829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.648842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.514 #11 NEW cov: 11765 ft: 13969 corp: 10/20b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 InsertByte- 00:07:56.514 [2024-12-13 07:03:14.689102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.689128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 [2024-12-13 07:03:14.689184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.689207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.514 [2024-12-13 07:03:14.689262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.689275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.514 #12 NEW cov: 11765 ft: 13985 corp: 11/23b lim: 5 exec/s: 0 rss: 66Mb L: 3/3 MS: 1 CrossOver- 00:07:56.514 [2024-12-13 07:03:14.729041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.729067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.514 [2024-12-13 07:03:14.729121] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.514 [2024-12-13 07:03:14.729135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.773 #13 NEW cov: 11765 ft: 14006 corp: 12/25b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 EraseBytes- 00:07:56.773 [2024-12-13 07:03:14.769123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.773 [2024-12-13 07:03:14.769149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.773 [2024-12-13 07:03:14.769226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.773 [2024-12-13 07:03:14.769241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.773 #14 NEW cov: 11765 ft: 14030 corp: 13/27b lim: 5 exec/s: 0 rss: 66Mb L: 2/3 MS: 1 EraseBytes- 00:07:56.773 [2024-12-13 07:03:14.809428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.773 [2024-12-13 07:03:14.809453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.773 [2024-12-13 07:03:14.809509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.773 [2024-12-13 07:03:14.809523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.773 [2024-12-13 07:03:14.809577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.773 [2024-12-13 07:03:14.809591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.773 #15 NEW cov: 11765 ft: 14041 corp: 14/30b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 InsertByte- 00:07:56.774 [2024-12-13 07:03:14.849515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.849540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.774 [2024-12-13 07:03:14.849592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.849606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.774 [2024-12-13 07:03:14.849664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 
07:03:14.849678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.774 #16 NEW cov: 11765 ft: 14057 corp: 15/33b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ShuffleBytes- 00:07:56.774 [2024-12-13 07:03:14.889668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.889693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.774 [2024-12-13 07:03:14.889765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.889779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.774 [2024-12-13 07:03:14.889836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.889850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.774 #17 NEW cov: 11765 ft: 14067 corp: 16/36b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeBit- 00:07:56.774 [2024-12-13 07:03:14.929581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.929606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.774 [2024-12-13 07:03:14.929661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.929674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.774 #18 NEW cov: 11765 ft: 14138 corp: 17/38b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 EraseBytes- 00:07:56.774 [2024-12-13 07:03:14.970023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.970047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.774 [2024-12-13 07:03:14.970100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.970114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.774 [2024-12-13 07:03:14.970164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.970178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.774 [2024-12-13 07:03:14.970236] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:14.970249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.774 #19 NEW cov: 11765 ft: 14454 corp: 18/42b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 CrossOver- 00:07:56.774 [2024-12-13 07:03:15.009829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:15.009857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.774 [2024-12-13 07:03:15.009911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.774 [2024-12-13 07:03:15.009925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.032 #20 NEW cov: 11765 ft: 14492 corp: 19/44b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 EraseBytes- 00:07:57.032 [2024-12-13 07:03:15.049932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.049957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.032 [2024-12-13 07:03:15.050014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.050027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.032 #21 NEW cov: 11765 ft: 14496 corp: 20/46b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 CopyPart- 00:07:57.032 [2024-12-13 07:03:15.090065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.090090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.032 [2024-12-13 07:03:15.090145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.090159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.032 #22 NEW cov: 11765 ft: 14511 corp: 21/48b lim: 5 exec/s: 0 rss: 67Mb L: 2/4 MS: 1 ChangeBit- 00:07:57.032 [2024-12-13 07:03:15.130505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.130530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.032 [2024-12-13 07:03:15.130586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.130599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.032 [2024-12-13 07:03:15.130651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.130665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.032 [2024-12-13 07:03:15.130718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.130731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.032 #23 NEW cov: 11765 ft: 14526 corp: 22/52b lim: 5 exec/s: 0 rss: 67Mb L: 4/4 MS: 1 InsertByte- 00:07:57.032 [2024-12-13 07:03:15.170285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.170309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.032 [2024-12-13 07:03:15.170382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.032 [2024-12-13 07:03:15.170396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.291 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.291 #24 NEW cov: 11788 ft: 14566 corp: 23/54b lim: 5 exec/s: 24 rss: 68Mb L: 2/4 MS: 1 EraseBytes- 00:07:57.291 [2024-12-13 07:03:15.481695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.291 [2024-12-13 07:03:15.481748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.291 [2024-12-13 07:03:15.481839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.291 [2024-12-13 07:03:15.481873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.291 [2024-12-13 07:03:15.481930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.291 [2024-12-13 07:03:15.481949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.291 #25 NEW cov: 11788 ft: 14685 corp: 24/57b lim: 5 exec/s: 25 rss: 68Mb L: 3/4 MS: 1 InsertByte- 00:07:57.291 [2024-12-13 07:03:15.521267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.291 [2024-12-13 
07:03:15.521293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.291 [2024-12-13 07:03:15.521349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.291 [2024-12-13 07:03:15.521363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.550 #26 NEW cov: 11788 ft: 14698 corp: 25/59b lim: 5 exec/s: 26 rss: 68Mb L: 2/4 MS: 1 ChangeBit- 00:07:57.550 [2024-12-13 07:03:15.561821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.561846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.561916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.561930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.561983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.561996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.562048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.562062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.562114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.562130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.550 #27 NEW cov: 11788 ft: 14785 corp: 26/64b lim: 5 exec/s: 27 rss: 68Mb L: 5/5 MS: 1 CopyPart- 00:07:57.550 [2024-12-13 07:03:15.611411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.611436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.550 #28 NEW cov: 11788 ft: 14796 corp: 27/65b lim: 5 exec/s: 28 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:07:57.550 [2024-12-13 07:03:15.652012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.652037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.652092] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.652105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.652160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.652173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.652230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.652243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.550 #29 NEW cov: 11788 ft: 14815 corp: 28/69b lim: 5 exec/s: 29 rss: 68Mb L: 4/5 MS: 1 CMP- DE: "\005\000"- 00:07:57.550 [2024-12-13 07:03:15.692098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.692122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.692178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.692196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.692265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.692279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.550 [2024-12-13 07:03:15.692331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.550 [2024-12-13 07:03:15.692345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.550 #30 NEW cov: 11788 ft: 14838 corp: 29/73b lim: 5 exec/s: 30 rss: 69Mb L: 4/5 MS: 1 CrossOver- 00:07:57.551 [2024-12-13 07:03:15.732220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.551 [2024-12-13 07:03:15.732245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.551 [2024-12-13 07:03:15.732303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.551 [2024-12-13 07:03:15.732317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:57.551 [2024-12-13 07:03:15.732390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.551 [2024-12-13 07:03:15.732404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.551 [2024-12-13 07:03:15.732458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.551 [2024-12-13 07:03:15.732472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.551 #31 NEW cov: 11788 ft: 14888 corp: 30/77b lim: 5 exec/s: 31 rss: 69Mb L: 4/5 MS: 1 PersAutoDict- DE: "\005\000"- 00:07:57.551 [2024-12-13 07:03:15.772043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.551 [2024-12-13 07:03:15.772068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.551 [2024-12-13 07:03:15.772122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.551 [2024-12-13 07:03:15.772135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.810 #32 NEW cov: 11788 ft: 14917 corp: 31/79b lim: 5 exec/s: 32 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:57.810 [2024-12-13 07:03:15.812427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.812451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.810 [2024-12-13 07:03:15.812507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.812520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.810 [2024-12-13 07:03:15.812557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.812570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.810 [2024-12-13 07:03:15.812624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.812637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.810 #33 NEW cov: 11788 ft: 14931 corp: 32/83b lim: 5 exec/s: 33 rss: 69Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:57.810 [2024-12-13 07:03:15.852529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.852554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.810 [2024-12-13 07:03:15.852608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.852622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.810 [2024-12-13 07:03:15.852681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.852694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.810 [2024-12-13 07:03:15.852749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.852762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.810 #34 NEW cov: 11788 ft: 14943 corp: 33/87b lim: 5 exec/s: 34 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt- 00:07:57.810 [2024-12-13 07:03:15.892531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.892556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.810 [2024-12-13 07:03:15.892611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.810 [2024-12-13 07:03:15.892625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.810 [2024-12-13 07:03:15.892682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:15.892696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.811 #35 NEW cov: 11788 ft: 14954 corp: 34/90b lim: 5 exec/s: 35 rss: 69Mb L: 3/5 MS: 1 ChangeBit- 00:07:57.811 [2024-12-13 07:03:15.932671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:15.932696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.811 [2024-12-13 07:03:15.932752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:15.932766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.811 [2024-12-13 07:03:15.932817] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:15.932830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.811 #36 NEW cov: 11788 ft: 14987 corp: 35/93b lim: 5 exec/s: 36 rss: 69Mb L: 3/5 MS: 1 ChangeBit- 00:07:57.811 [2024-12-13 07:03:15.972564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:15.972590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.811 [2024-12-13 07:03:15.972644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:15.972657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.811 #37 NEW cov: 11788 ft: 14993 corp: 36/95b lim: 5 exec/s: 37 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:57.811 [2024-12-13 07:03:16.012991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:16.013019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.811 [2024-12-13 07:03:16.013092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:16.013106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.811 [2024-12-13 07:03:16.013159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:16.013174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.811 [2024-12-13 07:03:16.013233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.811 [2024-12-13 07:03:16.013247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.811 #38 NEW cov: 11788 ft: 15002 corp: 37/99b lim: 5 exec/s: 38 rss: 69Mb L: 4/5 MS: 1 PersAutoDict- DE: "\005\000"- 00:07:58.071 [2024-12-13 07:03:16.052987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.053013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.071 [2024-12-13 07:03:16.053071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 
07:03:16.053085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.071 [2024-12-13 07:03:16.053140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.053154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.071 #39 NEW cov: 11788 ft: 15003 corp: 38/102b lim: 5 exec/s: 39 rss: 69Mb L: 3/5 MS: 1 ChangeBit- 00:07:58.071 [2024-12-13 07:03:16.093091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.093116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.071 [2024-12-13 07:03:16.093173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.093192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.071 [2024-12-13 07:03:16.093251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.093264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.071 #40 NEW cov: 11788 ft: 15009 corp: 39/105b lim: 5 exec/s: 40 rss: 69Mb L: 3/5 MS: 1 PersAutoDict- DE: "\005\000"- 00:07:58.071 [2024-12-13 07:03:16.133354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.133378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.071 [2024-12-13 07:03:16.133433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.133447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.071 [2024-12-13 07:03:16.133500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.133513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.071 [2024-12-13 07:03:16.133567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.133580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.071 #41 NEW cov: 11788 ft: 15015 corp: 40/109b lim: 5 exec/s: 41 rss: 69Mb L: 4/5 MS: 1 ChangeByte- 00:07:58.071 [2024-12-13 07:03:16.173184] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.173214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.071 [2024-12-13 07:03:16.173269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.173283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.071 #42 NEW cov: 11788 ft: 15045 corp: 41/111b lim: 5 exec/s: 42 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:58.071 [2024-12-13 07:03:16.213244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.213269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.071 [2024-12-13 07:03:16.213336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.213350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.071 #43 NEW cov: 11788 ft: 15063 corp: 42/113b lim: 5 exec/s: 43 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:58.071 [2024-12-13 07:03:16.253414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.071 [2024-12-13 07:03:16.253440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.072 [2024-12-13 07:03:16.253494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.072 [2024-12-13 07:03:16.253508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.072 #44 NEW cov: 11788 ft: 15097 corp: 43/115b lim: 5 exec/s: 22 rss: 69Mb L: 2/5 MS: 1 ChangeByte- 00:07:58.072 #44 DONE cov: 11788 ft: 15097 corp: 43/115b lim: 5 exec/s: 22 rss: 69Mb 00:07:58.072 ###### Recommended dictionary. ###### 00:07:58.072 "\005\000" # Uses: 3 00:07:58.072 ###### End of recommended dictionary. 
###### 00:07:58.072 Done 44 runs in 2 second(s) 00:07:58.331 07:03:16 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:58.331 07:03:16 -- ../common.sh@72 -- # (( i++ )) 00:07:58.331 07:03:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.331 07:03:16 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:58.331 07:03:16 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:58.331 07:03:16 -- nvmf/run.sh@24 -- # local timen=1 00:07:58.331 07:03:16 -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.331 07:03:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:58.331 07:03:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:58.331 07:03:16 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:58.331 07:03:16 -- nvmf/run.sh@29 -- # port=4410 00:07:58.331 07:03:16 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:58.331 07:03:16 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:58.331 07:03:16 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.331 07:03:16 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:58.331 [2024-12-13 07:03:16.430022] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:58.331 [2024-12-13 07:03:16.430108] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid494295 ] 00:07:58.331 EAL: No free 2048 kB hugepages reported on node 1 00:07:58.591 [2024-12-13 07:03:16.616965] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.591 [2024-12-13 07:03:16.636538] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:58.591 [2024-12-13 07:03:16.636659] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.591 [2024-12-13 07:03:16.687920] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.591 [2024-12-13 07:03:16.704234] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:58.591 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.591 INFO: Seed: 3912860566 00:07:58.591 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:58.591 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:58.591 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:58.591 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.591 #2 INITED exec/s: 0 rss: 59Mb 00:07:58.591 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:58.591 This may also happen if the target rejected all inputs we tried so far 00:07:58.591 [2024-12-13 07:03:16.759552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.591 [2024-12-13 07:03:16.759580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.591 [2024-12-13 07:03:16.759634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.591 [2024-12-13 07:03:16.759647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.851 NEW_FUNC[1/670]: 0x45e248 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:58.851 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.851 #5 NEW cov: 11584 ft: 11583 corp: 2/21b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:58.851 [2024-12-13 07:03:17.070489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.851 [2024-12-13 07:03:17.070549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.851 [2024-12-13 07:03:17.070644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.851 [2024-12-13 07:03:17.070670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.111 #6 NEW cov: 11697 ft: 12066 corp: 3/41b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 CrossOver- 00:07:59.111 [2024-12-13 07:03:17.120372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.120396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.111 [2024-12-13 07:03:17.120466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff05 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.120479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.111 #7 NEW cov: 11703 ft: 12273 corp: 4/61b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ChangeBinInt- 00:07:59.111 [2024-12-13 07:03:17.160433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:2dffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.160456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.111 [2024-12-13 07:03:17.160511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 
cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.160525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.111 #8 NEW cov: 11788 ft: 12544 corp: 5/81b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ChangeByte- 00:07:59.111 [2024-12-13 07:03:17.200537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.200560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.111 [2024-12-13 07:03:17.200616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff05 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.200630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.111 #9 NEW cov: 11788 ft: 12781 corp: 6/101b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:59.111 [2024-12-13 07:03:17.240681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.240705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.111 [2024-12-13 07:03:17.240775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.240789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.111 #10 NEW cov: 11788 ft: 12836 corp: 7/121b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ChangeByte- 00:07:59.111 [2024-12-13 07:03:17.280764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.280788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.111 [2024-12-13 07:03:17.280862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff5cffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.280876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.111 #11 NEW cov: 11788 ft: 12979 corp: 8/141b lim: 40 exec/s: 0 rss: 67Mb L: 20/20 MS: 1 ChangeByte- 00:07:59.111 [2024-12-13 07:03:17.321164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.321190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.111 [2024-12-13 07:03:17.321265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 
07:03:17.321278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.111 [2024-12-13 07:03:17.321337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.321350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.111 [2024-12-13 07:03:17.321406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff24 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.111 [2024-12-13 07:03:17.321419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.111 #12 NEW cov: 11788 ft: 13513 corp: 9/175b lim: 40 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 CopyPart- 00:07:59.372 [2024-12-13 07:03:17.361010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.361034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.372 [2024-12-13 07:03:17.361106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff05ff05 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.361119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.372 #13 NEW cov: 11788 ft: 13544 corp: 10/195b lim: 40 exec/s: 0 rss: 67Mb L: 20/34 MS: 1 CrossOver- 00:07:59.372 [2024-12-13 07:03:17.401146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.401170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.372 [2024-12-13 07:03:17.401230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.401243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.372 #14 NEW cov: 11788 ft: 13586 corp: 11/216b lim: 40 exec/s: 0 rss: 67Mb L: 21/34 MS: 1 InsertByte- 00:07:59.372 [2024-12-13 07:03:17.441319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.441343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.372 [2024-12-13 07:03:17.441399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:5cffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.441416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.372 #15 NEW cov: 11788 ft: 13644 corp: 12/235b lim: 40 exec/s: 0 rss: 
68Mb L: 19/34 MS: 1 EraseBytes- 00:07:59.372 [2024-12-13 07:03:17.481668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.481692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.372 [2024-12-13 07:03:17.481749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.481762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.372 [2024-12-13 07:03:17.481817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.481830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.372 [2024-12-13 07:03:17.481885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:3dffff24 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.481898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.372 #21 NEW cov: 11788 ft: 13658 corp: 13/269b lim: 40 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeByte- 00:07:59.372 [2024-12-13 07:03:17.521506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.521530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.372 [2024-12-13 07:03:17.521600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.521614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.372 #22 NEW cov: 11788 ft: 13680 corp: 14/289b lim: 40 exec/s: 0 rss: 68Mb L: 20/34 MS: 1 ShuffleBytes- 00:07:59.372 [2024-12-13 07:03:17.561459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.561484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.372 #24 NEW cov: 11788 ft: 14026 corp: 15/297b lim: 40 exec/s: 0 rss: 68Mb L: 8/34 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:59.372 [2024-12-13 07:03:17.601687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.601712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.372 [2024-12-13 07:03:17.601768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff 
cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.372 [2024-12-13 07:03:17.601781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.631 #27 NEW cov: 11788 ft: 14052 corp: 16/318b lim: 40 exec/s: 0 rss: 68Mb L: 21/34 MS: 3 ShuffleBytes-ShuffleBytes-CrossOver- 00:07:59.631 [2024-12-13 07:03:17.631807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.631 [2024-12-13 07:03:17.631834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.631 [2024-12-13 07:03:17.631889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.631 [2024-12-13 07:03:17.631903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.631 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:59.631 #28 NEW cov: 11811 ft: 14101 corp: 17/338b lim: 40 exec/s: 0 rss: 68Mb L: 20/34 MS: 1 ChangeBinInt- 00:07:59.631 [2024-12-13 07:03:17.672158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.631 [2024-12-13 07:03:17.672182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.631 [2024-12-13 07:03:17.672247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.631 [2024-12-13 07:03:17.672277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.631 [2024-12-13 07:03:17.672335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff0ae1 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.631 [2024-12-13 07:03:17.672348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.672404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.672416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.632 #29 NEW cov: 11811 ft: 14143 corp: 18/370b lim: 40 exec/s: 0 rss: 68Mb L: 32/34 MS: 1 CopyPart- 00:07:59.632 [2024-12-13 07:03:17.712261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.712285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.712341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:59.632 [2024-12-13 07:03:17.712354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.712408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff0ae1 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.712421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.712475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.712487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.632 #30 NEW cov: 11811 ft: 14236 corp: 19/402b lim: 40 exec/s: 0 rss: 68Mb L: 32/34 MS: 1 ShuffleBytes- 00:07:59.632 [2024-12-13 07:03:17.752180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.752207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.752284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffff7ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.752298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.632 #31 NEW cov: 11811 ft: 14251 corp: 20/422b lim: 40 exec/s: 31 rss: 68Mb L: 20/34 MS: 1 ChangeBit- 00:07:59.632 [2024-12-13 07:03:17.792609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.792634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.792691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.792704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.792758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.792772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.792825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.792838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.632 #32 NEW cov: 11811 ft: 14259 corp: 21/458b lim: 40 exec/s: 32 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:59.632 [2024-12-13 
07:03:17.832821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.832846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.832923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.832937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.832993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff0ae1 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.833007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.833062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffff0102 cdw11:e7a63e25 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.833075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.632 [2024-12-13 07:03:17.833130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:03aaffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.632 [2024-12-13 07:03:17.833143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.632 #33 NEW cov: 11811 ft: 14312 corp: 22/498b lim: 40 exec/s: 33 rss: 68Mb L: 40/40 MS: 1 CMP- DE: "\001\002\347\246>%\003\252"- 00:07:59.891 [2024-12-13 07:03:17.872542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.891 [2024-12-13 07:03:17.872567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.891 [2024-12-13 07:03:17.872626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:80ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.891 [2024-12-13 07:03:17.872639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.891 #34 NEW cov: 11811 ft: 14370 corp: 23/519b lim: 40 exec/s: 34 rss: 68Mb L: 21/40 MS: 1 InsertByte- 00:07:59.891 [2024-12-13 07:03:17.912519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.891 [2024-12-13 07:03:17.912542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.891 #35 NEW cov: 11811 ft: 14397 corp: 24/528b lim: 40 exec/s: 35 rss: 68Mb L: 9/40 MS: 1 CrossOver- 00:07:59.891 [2024-12-13 07:03:17.952989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:07:59.891 [2024-12-13 07:03:17.953013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.891 [2024-12-13 07:03:17.953070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.891 [2024-12-13 07:03:17.953083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.891 [2024-12-13 07:03:17.953138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.891 [2024-12-13 07:03:17.953151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.891 [2024-12-13 07:03:17.953205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:3dffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.891 [2024-12-13 07:03:17.953234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.891 #36 NEW cov: 11811 ft: 14412 corp: 25/562b lim: 40 exec/s: 36 rss: 68Mb L: 34/40 MS: 1 ShuffleBytes- 00:07:59.891 [2024-12-13 07:03:17.992756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.891 [2024-12-13 07:03:17.992780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.891 #37 NEW cov: 11811 ft: 14429 corp: 26/571b lim: 40 exec/s: 37 rss: 68Mb L: 9/40 MS: 1 ChangeByte- 00:07:59.891 [2024-12-13 07:03:18.032882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.891 [2024-12-13 07:03:18.032905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.891 #38 NEW cov: 11811 ft: 14445 corp: 27/586b lim: 40 exec/s: 38 rss: 68Mb L: 15/40 MS: 1 EraseBytes- 00:07:59.891 [2024-12-13 07:03:18.073371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.891 [2024-12-13 07:03:18.073395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.892 [2024-12-13 07:03:18.073454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff010101 cdw11:0101ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.892 [2024-12-13 07:03:18.073467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.892 [2024-12-13 07:03:18.073526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.892 [2024-12-13 07:03:18.073539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.892 [2024-12-13 07:03:18.073595] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.892 [2024-12-13 07:03:18.073609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.892 #39 NEW cov: 11811 ft: 14453 corp: 28/625b lim: 40 exec/s: 39 rss: 68Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:07:59.892 [2024-12-13 07:03:18.113447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.892 [2024-12-13 07:03:18.113471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.892 [2024-12-13 07:03:18.113527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.892 [2024-12-13 07:03:18.113540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.892 [2024-12-13 07:03:18.113595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.892 [2024-12-13 07:03:18.113607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.892 [2024-12-13 07:03:18.113662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffff24ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.892 [2024-12-13 07:03:18.113674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.150 #40 NEW cov: 11811 ft: 14505 corp: 29/659b lim: 40 exec/s: 40 rss: 68Mb L: 34/40 MS: 1 ShuffleBytes- 00:08:00.150 [2024-12-13 07:03:18.153343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.153367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.153420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:80ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.153434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.150 #41 NEW cov: 11811 ft: 14516 corp: 30/680b lim: 40 exec/s: 41 rss: 68Mb L: 21/40 MS: 1 CopyPart- 00:08:00.150 [2024-12-13 07:03:18.193582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff0102e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.193606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.193681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a63e2503 cdw11:aaffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 
07:03:18.193694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.193753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff05ff05 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.193766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.150 #42 NEW cov: 11811 ft: 14690 corp: 31/708b lim: 40 exec/s: 42 rss: 68Mb L: 28/40 MS: 1 PersAutoDict- DE: "\001\002\347\246>%\003\252"- 00:08:00.150 [2024-12-13 07:03:18.233579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.233603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.233659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff8aff cdw11:ffff05ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.233673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.150 #43 NEW cov: 11811 ft: 14713 corp: 32/729b lim: 40 exec/s: 43 rss: 68Mb L: 21/40 MS: 1 InsertByte- 00:08:00.150 [2024-12-13 07:03:18.273836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff0102e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.273860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.273935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a63e2503 cdw11:aaffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.273949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.274006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.274019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.150 #44 NEW cov: 11811 ft: 14744 corp: 33/757b lim: 40 exec/s: 44 rss: 68Mb L: 28/40 MS: 1 PersAutoDict- DE: "\001\002\347\246>%\003\252"- 00:08:00.150 [2024-12-13 07:03:18.314152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.314175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.314233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.314247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:08:00.150 [2024-12-13 07:03:18.314300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffef0ae1 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.314313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.314368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffff0102 cdw11:e7a63e25 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.314381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.314435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:03aaffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.314448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.150 #45 NEW cov: 11811 ft: 14761 corp: 34/797b lim: 40 exec/s: 45 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:08:00.150 [2024-12-13 07:03:18.354147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff24ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.354173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.354247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.354261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.354314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.354327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.150 [2024-12-13 07:03:18.354381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffff24ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.150 [2024-12-13 07:03:18.354394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.150 #46 NEW cov: 11811 ft: 14831 corp: 35/831b lim: 40 exec/s: 46 rss: 69Mb L: 34/40 MS: 1 CopyPart- 00:08:00.409 [2024-12-13 07:03:18.394181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.409 [2024-12-13 07:03:18.394224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.409 [2024-12-13 07:03:18.394285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.409 [2024-12-13 07:03:18.394298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.409 [2024-12-13 07:03:18.394354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff0aff cdw11:ffffffe1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.409 [2024-12-13 07:03:18.394367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.409 #47 NEW cov: 11811 ft: 14892 corp: 36/855b lim: 40 exec/s: 47 rss: 69Mb L: 24/40 MS: 1 CopyPart- 00:08:00.409 [2024-12-13 07:03:18.434131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.409 [2024-12-13 07:03:18.434155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.409 [2024-12-13 07:03:18.434229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffef SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.409 [2024-12-13 07:03:18.434243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.409 #48 NEW cov: 11811 ft: 14974 corp: 37/875b lim: 40 exec/s: 48 rss: 69Mb L: 20/40 MS: 1 ChangeBit- 00:08:00.409 [2024-12-13 07:03:18.474415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.409 [2024-12-13 07:03:18.474439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.409 [2024-12-13 07:03:18.474498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.409 [2024-12-13 07:03:18.474511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.409 [2024-12-13 07:03:18.474570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff3d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.474584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.410 #49 NEW cov: 11811 ft: 14986 corp: 38/904b lim: 40 exec/s: 49 rss: 69Mb L: 29/40 MS: 1 EraseBytes- 00:08:00.410 [2024-12-13 07:03:18.514518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.514541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.410 [2024-12-13 07:03:18.514600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.514613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.410 [2024-12-13 07:03:18.514667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff 
cdw11:ffff80ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.514681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.410 #50 NEW cov: 11811 ft: 14998 corp: 39/935b lim: 40 exec/s: 50 rss: 69Mb L: 31/40 MS: 1 InsertRepeatedBytes- 00:08:00.410 [2024-12-13 07:03:18.554423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.554446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.410 #51 NEW cov: 11811 ft: 15008 corp: 40/948b lim: 40 exec/s: 51 rss: 69Mb L: 13/40 MS: 1 EraseBytes- 00:08:00.410 [2024-12-13 07:03:18.594844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.594868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.410 [2024-12-13 07:03:18.594925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.594938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.410 [2024-12-13 07:03:18.594995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.595008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.410 [2024-12-13 07:03:18.595063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0ae1ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.595075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.410 #52 NEW cov: 11811 ft: 15017 corp: 41/986b lim: 40 exec/s: 52 rss: 69Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:08:00.410 [2024-12-13 07:03:18.634889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff0102e7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.634913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.410 [2024-12-13 07:03:18.634968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:1c3e2503 cdw11:aaffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.634984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.410 [2024-12-13 07:03:18.635039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.410 [2024-12-13 07:03:18.635052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.669 #53 NEW cov: 11811 ft: 15034 corp: 42/1014b lim: 40 exec/s: 53 rss: 69Mb L: 28/40 MS: 1 ChangeBinInt- 00:08:00.669 [2024-12-13 07:03:18.674892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffa1ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.669 [2024-12-13 07:03:18.674915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.669 [2024-12-13 07:03:18.674971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.669 [2024-12-13 07:03:18.674984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.669 #54 NEW cov: 11811 ft: 15050 corp: 43/1034b lim: 40 exec/s: 54 rss: 69Mb L: 20/40 MS: 1 ChangeByte- 00:08:00.669 [2024-12-13 07:03:18.714863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.669 [2024-12-13 07:03:18.714887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.669 #55 NEW cov: 11811 ft: 15072 corp: 44/1044b lim: 40 exec/s: 55 rss: 69Mb L: 10/40 MS: 1 CopyPart- 00:08:00.669 [2024-12-13 07:03:18.755382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.669 [2024-12-13 07:03:18.755406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.669 [2024-12-13 07:03:18.755459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.669 [2024-12-13 07:03:18.755472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.669 [2024-12-13 07:03:18.755522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff0ae1 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.669 [2024-12-13 07:03:18.755536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.669 [2024-12-13 07:03:18.755588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.669 [2024-12-13 07:03:18.755601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.669 #56 NEW cov: 11811 ft: 15079 corp: 45/1076b lim: 40 exec/s: 28 rss: 69Mb L: 32/40 MS: 1 ShuffleBytes- 00:08:00.669 #56 DONE cov: 11811 ft: 15079 corp: 45/1076b lim: 40 exec/s: 28 rss: 69Mb 00:08:00.669 ###### Recommended dictionary. ###### 00:08:00.669 "\001\002\347\246>%\003\252" # Uses: 2 00:08:00.669 ###### End of recommended dictionary. 
######
00:08:00.669 Done 56 runs in 2 second(s)
00:08:00.670 07:03:18 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf
00:08:00.670 07:03:18 -- ../common.sh@72 -- # (( i++ ))
00:08:00.670 07:03:18 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:00.670 07:03:18 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1
00:08:00.670 07:03:18 -- nvmf/run.sh@23 -- # local fuzzer_type=11
00:08:00.670 07:03:18 -- nvmf/run.sh@24 -- # local timen=1
00:08:00.670 07:03:18 -- nvmf/run.sh@25 -- # local core=0x1
00:08:00.670 07:03:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:08:00.670 07:03:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf
00:08:00.670 07:03:18 -- nvmf/run.sh@29 -- # printf %02d 11
00:08:00.670 07:03:18 -- nvmf/run.sh@29 -- # port=4411
00:08:00.670 07:03:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:08:00.670 07:03:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411'
00:08:00.670 07:03:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:00.670 07:03:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock
00:08:00.929 [2024-12-13 07:03:18.927075] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:00.929 [2024-12-13 07:03:18.927151] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid494634 ]
00:08:00.929 EAL: No free 2048 kB hugepages reported on node 1
00:08:00.929 [2024-12-13 07:03:19.106303] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:00.929 [2024-12-13 07:03:19.126418] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:00.929 [2024-12-13 07:03:19.126540] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:01.188 [2024-12-13 07:03:19.178047] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:01.188 [2024-12-13 07:03:19.194392] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 ***
00:08:01.188 INFO: Running with entropic power schedule (0xFF, 100).
00:08:01.188 INFO: Seed: 2106882874
00:08:01.188 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:01.188 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:01.188 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:08:01.188 INFO: A corpus is not provided, starting from an empty corpus
00:08:01.188 #2 INITED exec/s: 0 rss: 59Mb
00:08:01.188 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:01.188 This may also happen if the target rejected all inputs we tried so far 00:08:01.188 [2024-12-13 07:03:19.260915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.188 [2024-12-13 07:03:19.260950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.188 [2024-12-13 07:03:19.261098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.188 [2024-12-13 07:03:19.261115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.188 [2024-12-13 07:03:19.261245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.188 [2024-12-13 07:03:19.261262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.447 NEW_FUNC[1/667]: 0x45ffb8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:01.447 NEW_FUNC[2/667]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.447 #10 NEW cov: 11577 ft: 11597 corp: 2/25b lim: 40 exec/s: 0 rss: 67Mb L: 24/24 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:08:01.447 [2024-12-13 07:03:19.581755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.447 [2024-12-13 07:03:19.581802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.447 [2024-12-13 07:03:19.581932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.447 [2024-12-13 07:03:19.581954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.447 [2024-12-13 07:03:19.582094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.447 [2024-12-13 07:03:19.582115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.447 NEW_FUNC[1/4]: 0x1c716e8 in spdk_thread_is_exited /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:728 00:08:01.447 NEW_FUNC[2/4]: 0x1c72478 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:797 00:08:01.447 #16 NEW cov: 11709 ft: 12062 corp: 3/49b lim: 40 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 ChangeByte- 00:08:01.447 [2024-12-13 07:03:19.631740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.447 [2024-12-13 07:03:19.631773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:01.448 [2024-12-13 07:03:19.631909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.448 [2024-12-13 07:03:19.631926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.448 [2024-12-13 07:03:19.632067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.448 [2024-12-13 07:03:19.632084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.448 #17 NEW cov: 11715 ft: 12325 corp: 4/73b lim: 40 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 ChangeBinInt- 00:08:01.448 [2024-12-13 07:03:19.682023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.448 [2024-12-13 07:03:19.682053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.448 [2024-12-13 07:03:19.682193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.448 [2024-12-13 07:03:19.682209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.448 [2024-12-13 07:03:19.682341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.448 [2024-12-13 07:03:19.682360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.708 #18 NEW cov: 11800 ft: 12552 corp: 5/97b lim: 40 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 CrossOver- 00:08:01.708 [2024-12-13 07:03:19.722107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.722134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.722273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42ffff00 cdw11:00424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.722292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.722420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.722438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.708 #19 NEW cov: 11800 ft: 12771 corp: 6/125b lim: 40 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 CMP- DE: "\377\377\000\000"- 00:08:01.708 [2024-12-13 07:03:19.762211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 
07:03:19.762237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.762369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.762386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.762523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.762541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.708 #20 NEW cov: 11800 ft: 12846 corp: 7/149b lim: 40 exec/s: 0 rss: 67Mb L: 24/28 MS: 1 ShuffleBytes- 00:08:01.708 [2024-12-13 07:03:19.802331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:4242ff42 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.802359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.802502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42ff0042 cdw11:00424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.802519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.802650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.802667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.708 #21 NEW cov: 11800 ft: 12974 corp: 8/177b lim: 40 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 ShuffleBytes- 00:08:01.708 [2024-12-13 07:03:19.842521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.842548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.842680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.842697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.842825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.842841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.708 #22 NEW cov: 11800 ft: 13022 corp: 9/201b lim: 40 exec/s: 0 rss: 67Mb L: 24/28 MS: 1 CrossOver- 00:08:01.708 [2024-12-13 07:03:19.882601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
SEND (81) qid:0 cid:4 nsid:0 cdw10:0a324242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.882628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.882769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:b94242b4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.882787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.882919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.882935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.708 #27 NEW cov: 11800 ft: 13110 corp: 10/227b lim: 40 exec/s: 0 rss: 67Mb L: 26/28 MS: 5 CopyPart-CopyPart-ChangeByte-ChangeASCIIInt-CrossOver- 00:08:01.708 [2024-12-13 07:03:19.922686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a42420a cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.922712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.922850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:424242b9 cdw11:4242b442 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.922867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.708 [2024-12-13 07:03:19.923005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.708 [2024-12-13 07:03:19.923020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.708 #30 NEW cov: 11800 ft: 13173 corp: 11/251b lim: 40 exec/s: 0 rss: 67Mb L: 24/28 MS: 3 CopyPart-CrossOver-CrossOver- 00:08:01.967 [2024-12-13 07:03:19.972898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:19.972927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.967 [2024-12-13 07:03:19.973059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:19.973075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.967 [2024-12-13 07:03:19.973209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:19.973225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.967 #31 NEW cov: 11800 ft: 13211 
corp: 12/275b lim: 40 exec/s: 0 rss: 67Mb L: 24/28 MS: 1 ShuffleBytes- 00:08:01.967 [2024-12-13 07:03:20.022700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.022727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.967 [2024-12-13 07:03:20.022858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.022874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.967 #32 NEW cov: 11800 ft: 13462 corp: 13/298b lim: 40 exec/s: 0 rss: 67Mb L: 23/28 MS: 1 EraseBytes- 00:08:01.967 [2024-12-13 07:03:20.073546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.073575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.967 [2024-12-13 07:03:20.073682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.073701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.967 [2024-12-13 07:03:20.073843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.073861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.967 [2024-12-13 07:03:20.073989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.074006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.967 #33 NEW cov: 11800 ft: 13891 corp: 14/337b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:01.967 [2024-12-13 07:03:20.123473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42ff4242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.123500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.967 [2024-12-13 07:03:20.123635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.123653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.967 [2024-12-13 07:03:20.123792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.123808] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.967 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.967 #34 NEW cov: 11823 ft: 13944 corp: 15/361b lim: 40 exec/s: 0 rss: 68Mb L: 24/39 MS: 1 ChangeByte- 00:08:01.967 [2024-12-13 07:03:20.172929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.967 [2024-12-13 07:03:20.172956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.967 #35 NEW cov: 11823 ft: 14667 corp: 16/376b lim: 40 exec/s: 0 rss: 68Mb L: 15/39 MS: 1 CrossOver- 00:08:02.226 [2024-12-13 07:03:20.213634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a42420a cdw11:4242b942 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.226 [2024-12-13 07:03:20.213661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.226 [2024-12-13 07:03:20.213787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42b94242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.226 [2024-12-13 07:03:20.213804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.226 [2024-12-13 07:03:20.213932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:b4424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.226 [2024-12-13 07:03:20.213952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.226 #36 NEW cov: 11823 ft: 14724 corp: 17/402b lim: 40 exec/s: 36 rss: 68Mb L: 26/39 MS: 1 CopyPart- 00:08:02.226 [2024-12-13 07:03:20.253493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.226 [2024-12-13 07:03:20.253520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.226 [2024-12-13 07:03:20.253670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42421800 cdw11:00004242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.226 [2024-12-13 07:03:20.253688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.226 [2024-12-13 07:03:20.253816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.226 [2024-12-13 07:03:20.253832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.226 #37 NEW cov: 11823 ft: 14742 corp: 18/426b lim: 40 exec/s: 37 rss: 68Mb L: 24/39 MS: 1 ChangeBinInt- 00:08:02.226 [2024-12-13 07:03:20.294012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.226 [2024-12-13 07:03:20.294039] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.226 [2024-12-13 07:03:20.294181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:40421800 cdw11:00004242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.294201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.227 [2024-12-13 07:03:20.294335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.294351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.227 #38 NEW cov: 11823 ft: 14751 corp: 19/450b lim: 40 exec/s: 38 rss: 68Mb L: 24/39 MS: 1 ChangeByte- 00:08:02.227 [2024-12-13 07:03:20.334113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a42420a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.334140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.227 [2024-12-13 07:03:20.334279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:004242b9 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.334297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.227 [2024-12-13 07:03:20.334421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.334438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.227 #39 NEW cov: 11823 ft: 14770 corp: 20/481b lim: 40 exec/s: 39 rss: 68Mb L: 31/39 MS: 1 InsertRepeatedBytes- 00:08:02.227 [2024-12-13 07:03:20.384345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a42420a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.384371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.227 [2024-12-13 07:03:20.384518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:004242b9 cdw11:1f004242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.384535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.227 [2024-12-13 07:03:20.384674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.384691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.227 #40 NEW cov: 11823 ft: 14842 corp: 21/512b lim: 40 exec/s: 40 rss: 68Mb L: 31/39 MS: 1 ChangeBinInt- 00:08:02.227 [2024-12-13 07:03:20.434354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:4 nsid:0 cdw10:0afffd82 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.434381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.227 [2024-12-13 07:03:20.434525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:82828282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.434541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.227 [2024-12-13 07:03:20.434669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:82828282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.434686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.227 [2024-12-13 07:03:20.434823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:82828282 cdw11:82828282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.227 [2024-12-13 07:03:20.434839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.227 #45 NEW cov: 11823 ft: 14893 corp: 22/549b lim: 40 exec/s: 45 rss: 68Mb L: 37/39 MS: 5 ShuffleBytes-InsertRepeatedBytes-CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:02.486 [2024-12-13 07:03:20.473812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.473839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.473966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.473983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.486 #46 NEW cov: 11823 ft: 14915 corp: 23/572b lim: 40 exec/s: 46 rss: 68Mb L: 23/39 MS: 1 PersAutoDict- DE: "\377\377\000\000"- 00:08:02.486 [2024-12-13 07:03:20.534749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.534777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.534925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.534944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.535077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.535096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.486 #47 NEW cov: 11823 
ft: 14974 corp: 24/596b lim: 40 exec/s: 47 rss: 68Mb L: 24/39 MS: 1 PersAutoDict- DE: "\377\377\000\000"- 00:08:02.486 [2024-12-13 07:03:20.594586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.594615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.594739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.594757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.486 #48 NEW cov: 11823 ft: 14984 corp: 25/614b lim: 40 exec/s: 48 rss: 68Mb L: 18/39 MS: 1 EraseBytes- 00:08:02.486 [2024-12-13 07:03:20.634686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.634714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.634848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.634866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.634993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.635009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.674835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.674861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.675002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.675019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.675155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffff0022 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.675173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.486 #50 NEW cov: 11823 ft: 14994 corp: 26/642b lim: 40 exec/s: 50 rss: 68Mb L: 28/39 MS: 2 PersAutoDict-ChangeByte- DE: "\377\377\000\000"- 00:08:02.486 [2024-12-13 07:03:20.725267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 
[2024-12-13 07:03:20.725295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.725430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.725447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.486 [2024-12-13 07:03:20.725578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff4242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.486 [2024-12-13 07:03:20.725602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.745 #51 NEW cov: 11823 ft: 15052 corp: 27/672b lim: 40 exec/s: 51 rss: 68Mb L: 30/39 MS: 1 InsertRepeatedBytes- 00:08:02.745 [2024-12-13 07:03:20.775060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.775088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.745 [2024-12-13 07:03:20.775229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.775248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.745 [2024-12-13 07:03:20.775374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.775390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.745 #52 NEW cov: 11823 ft: 15075 corp: 28/696b lim: 40 exec/s: 52 rss: 68Mb L: 24/39 MS: 1 ShuffleBytes- 00:08:02.745 [2024-12-13 07:03:20.815425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.815451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.745 [2024-12-13 07:03:20.815588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.815604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.745 [2024-12-13 07:03:20.815751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242420b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.815772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.745 #53 NEW cov: 11823 ft: 15082 corp: 29/720b lim: 40 exec/s: 53 rss: 68Mb L: 24/39 MS: 1 CrossOver- 00:08:02.745 [2024-12-13 07:03:20.855553] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.855581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.745 [2024-12-13 07:03:20.855719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.855735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.745 [2024-12-13 07:03:20.855877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:424242b4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.855896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.745 #54 NEW cov: 11823 ft: 15090 corp: 30/745b lim: 40 exec/s: 54 rss: 68Mb L: 25/39 MS: 1 CrossOver- 00:08:02.745 [2024-12-13 07:03:20.905794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.905822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.745 [2024-12-13 07:03:20.905969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.905987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.745 [2024-12-13 07:03:20.906119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424242 cdw11:4242980b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.745 [2024-12-13 07:03:20.906135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.746 #55 NEW cov: 11823 ft: 15103 corp: 31/769b lim: 40 exec/s: 55 rss: 68Mb L: 24/39 MS: 1 ChangeByte- 00:08:02.746 [2024-12-13 07:03:20.945588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.746 [2024-12-13 07:03:20.945615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.746 [2024-12-13 07:03:20.945763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:424242b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.746 [2024-12-13 07:03:20.945781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.746 [2024-12-13 07:03:20.945913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4242b442 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.746 [2024-12-13 07:03:20.945930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.746 #56 NEW cov: 11823 ft: 15106 corp: 
32/798b lim: 40 exec/s: 56 rss: 69Mb L: 29/39 MS: 1 CrossOver- 00:08:03.005 [2024-12-13 07:03:20.985617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:20.985646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:20.985777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:424242b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:20.985793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:20.985920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4242bc42 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:20.985938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.005 #57 NEW cov: 11823 ft: 15110 corp: 33/827b lim: 40 exec/s: 57 rss: 69Mb L: 29/39 MS: 1 ChangeBinInt- 00:08:03.005 [2024-12-13 07:03:21.026428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.026455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:21.026598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.026616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:21.026730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424200 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.026746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:21.026880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.026898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.005 #58 NEW cov: 11823 ft: 15118 corp: 34/866b lim: 40 exec/s: 58 rss: 69Mb L: 39/39 MS: 1 ShuffleBytes- 00:08:03.005 [2024-12-13 07:03:21.076446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.076472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:21.076601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.076617] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:21.076744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:42424200 cdw11:00100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.076763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:21.076883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.076899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.005 #59 NEW cov: 11823 ft: 15151 corp: 35/905b lim: 40 exec/s: 59 rss: 69Mb L: 39/39 MS: 1 CMP- DE: "\020\000"- 00:08:03.005 [2024-12-13 07:03:21.116718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a42420a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.116744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:21.116874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:004242b9 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.116893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:21.117017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4242b942 cdw11:42b44242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.117034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.005 [2024-12-13 07:03:21.117165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:42424242 cdw11:424242ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.117183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.005 #60 NEW cov: 11823 ft: 15163 corp: 36/940b lim: 40 exec/s: 60 rss: 69Mb L: 35/39 MS: 1 PersAutoDict- DE: "\377\377\000\000"- 00:08:03.005 [2024-12-13 07:03:21.156512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a324242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.005 [2024-12-13 07:03:21.156539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.006 [2024-12-13 07:03:21.156670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:b94242b4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.006 [2024-12-13 07:03:21.156687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.006 [2024-12-13 07:03:21.156823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:03424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.006 [2024-12-13 07:03:21.156841] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.006 #61 NEW cov: 11823 ft: 15175 corp: 37/966b lim: 40 exec/s: 61 rss: 69Mb L: 26/39 MS: 1 ChangeByte- 00:08:03.006 [2024-12-13 07:03:21.196384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424042 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.006 [2024-12-13 07:03:21.196411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.006 [2024-12-13 07:03:21.196537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:18000000 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.006 [2024-12-13 07:03:21.196553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.006 #62 NEW cov: 11823 ft: 15198 corp: 38/988b lim: 40 exec/s: 62 rss: 69Mb L: 22/39 MS: 1 EraseBytes- 00:08:03.006 [2024-12-13 07:03:21.236460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.006 [2024-12-13 07:03:21.236489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.006 [2024-12-13 07:03:21.236621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:42424242 cdw11:42424242 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.006 [2024-12-13 07:03:21.236638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.265 #63 NEW cov: 11823 ft: 15199 corp: 39/1006b lim: 40 exec/s: 31 rss: 69Mb L: 18/39 MS: 1 CrossOver- 00:08:03.265 #63 DONE cov: 11823 ft: 15199 corp: 39/1006b lim: 40 exec/s: 31 rss: 69Mb 00:08:03.265 ###### Recommended dictionary. ###### 00:08:03.265 "\377\377\000\000" # Uses: 4 00:08:03.265 "\020\000" # Uses: 0 00:08:03.265 ###### End of recommended dictionary. 
###### 00:08:03.265 Done 63 runs in 2 second(s) 00:08:03.265 07:03:21 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:08:03.265 07:03:21 -- ../common.sh@72 -- # (( i++ )) 00:08:03.265 07:03:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.265 07:03:21 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:03.265 07:03:21 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:03.265 07:03:21 -- nvmf/run.sh@24 -- # local timen=1 00:08:03.265 07:03:21 -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.265 07:03:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:03.265 07:03:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:03.265 07:03:21 -- nvmf/run.sh@29 -- # printf %02d 12 00:08:03.265 07:03:21 -- nvmf/run.sh@29 -- # port=4412 00:08:03.265 07:03:21 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:03.265 07:03:21 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:03.265 07:03:21 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.265 07:03:21 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:08:03.265 [2024-12-13 07:03:21.419380] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:03.265 [2024-12-13 07:03:21.419470] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid495171 ] 00:08:03.265 EAL: No free 2048 kB hugepages reported on node 1 00:08:03.523 [2024-12-13 07:03:21.593395] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.524 [2024-12-13 07:03:21.612965] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:03.524 [2024-12-13 07:03:21.613093] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.524 [2024-12-13 07:03:21.664435] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.524 [2024-12-13 07:03:21.680723] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:03.524 INFO: Running with entropic power schedule (0xFF, 100). 00:08:03.524 INFO: Seed: 297925253 00:08:03.524 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:03.524 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:03.524 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:03.524 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.524 #2 INITED exec/s: 0 rss: 59Mb 00:08:03.524 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:03.524 This may also happen if the target rejected all inputs we tried so far 00:08:03.524 [2024-12-13 07:03:21.730121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5e5e5e cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.524 [2024-12-13 07:03:21.730149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.524 [2024-12-13 07:03:21.730211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.524 [2024-12-13 07:03:21.730225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.524 [2024-12-13 07:03:21.730282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.524 [2024-12-13 07:03:21.730296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.524 [2024-12-13 07:03:21.730351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:5e5e5e5e cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.524 [2024-12-13 07:03:21.730365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.782 NEW_FUNC[1/671]: 0x461d28 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:03.782 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.782 #29 NEW cov: 11593 ft: 11595 corp: 2/40b lim: 40 exec/s: 0 rss: 67Mb L: 39/39 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:04.041 [2024-12-13 07:03:22.030392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5e5e5e cdw11:5e5e5e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.041 [2024-12-13 07:03:22.030422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.041 #30 NEW cov: 11707 ft: 12965 corp: 3/48b lim: 40 exec/s: 0 rss: 67Mb L: 8/39 MS: 1 CrossOver- 00:08:04.041 [2024-12-13 07:03:22.080664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.041 [2024-12-13 07:03:22.080690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.041 [2024-12-13 07:03:22.080746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.041 [2024-12-13 07:03:22.080762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.041 #31 NEW cov: 11713 ft: 13519 corp: 4/67b lim: 40 exec/s: 0 rss: 67Mb L: 19/39 MS: 1 InsertRepeatedBytes- 00:08:04.041 [2024-12-13 07:03:22.120772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.041 [2024-12-13 07:03:22.120798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.041 [2024-12-13 07:03:22.120857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.041 [2024-12-13 07:03:22.120871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.041 #32 NEW cov: 11798 ft: 13819 corp: 5/86b lim: 40 exec/s: 0 rss: 67Mb L: 19/39 MS: 1 ChangeBinInt- 00:08:04.041 [2024-12-13 07:03:22.170747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.041 [2024-12-13 07:03:22.170772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.041 #36 NEW cov: 11798 ft: 13863 corp: 6/94b lim: 40 exec/s: 0 rss: 67Mb L: 8/39 MS: 4 ChangeByte-ChangeByte-ShuffleBytes-CrossOver- 00:08:04.041 [2024-12-13 07:03:22.210869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:015e5e5e cdw11:5e5e5e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.041 [2024-12-13 07:03:22.210894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.041 #37 NEW cov: 11798 ft: 13958 corp: 7/102b lim: 40 exec/s: 0 rss: 67Mb L: 8/39 MS: 1 ChangeBinInt- 00:08:04.041 [2024-12-13 07:03:22.261218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.041 [2024-12-13 07:03:22.261244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.041 [2024-12-13 07:03:22.261304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.041 [2024-12-13 07:03:22.261318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.041 #38 NEW cov: 11798 ft: 14036 corp: 8/121b lim: 40 exec/s: 0 rss: 67Mb L: 19/39 MS: 1 CrossOver- 00:08:04.300 [2024-12-13 07:03:22.301109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.300 [2024-12-13 07:03:22.301134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.300 #39 NEW cov: 11798 ft: 14117 corp: 9/131b lim: 40 exec/s: 0 rss: 67Mb L: 10/39 MS: 1 EraseBytes- 00:08:04.300 [2024-12-13 07:03:22.341248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:015e5e5e cdw11:5eaa5e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.300 [2024-12-13 07:03:22.341273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.300 #40 NEW cov: 11798 ft: 14139 corp: 10/139b lim: 40 exec/s: 0 rss: 67Mb L: 8/39 MS: 1 ChangeBinInt- 00:08:04.300 [2024-12-13 07:03:22.381689] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.301 [2024-12-13 07:03:22.381714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.301 [2024-12-13 07:03:22.381787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.301 [2024-12-13 07:03:22.381804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.301 [2024-12-13 07:03:22.381862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.301 [2024-12-13 07:03:22.381876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.301 #41 NEW cov: 11798 ft: 14450 corp: 11/169b lim: 40 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 InsertRepeatedBytes- 00:08:04.301 [2024-12-13 07:03:22.421501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.301 [2024-12-13 07:03:22.421526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.301 #42 NEW cov: 11798 ft: 14462 corp: 12/180b lim: 40 exec/s: 0 rss: 68Mb L: 11/39 MS: 1 InsertByte- 00:08:04.301 [2024-12-13 07:03:22.461921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00100000 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.301 [2024-12-13 07:03:22.461946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.301 [2024-12-13 07:03:22.462023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.301 [2024-12-13 07:03:22.462036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.301 [2024-12-13 07:03:22.462095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.301 [2024-12-13 07:03:22.462109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.301 #43 NEW cov: 11798 ft: 14494 corp: 13/210b lim: 40 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 ChangeBit- 00:08:04.301 [2024-12-13 07:03:22.501658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:b6ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.301 [2024-12-13 07:03:22.501683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.301 #45 NEW cov: 11798 ft: 14539 corp: 14/218b lim: 40 exec/s: 0 rss: 68Mb L: 8/39 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:04.560 [2024-12-13 07:03:22.542223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.542249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.560 [2024-12-13 07:03:22.542309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.542323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.560 [2024-12-13 07:03:22.542381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:e0e09ae0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.542395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.560 #46 NEW cov: 11798 ft: 14591 corp: 15/248b lim: 40 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 ChangeByte- 00:08:04.560 [2024-12-13 07:03:22.582282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.582310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.560 [2024-12-13 07:03:22.582386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.582401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.560 [2024-12-13 07:03:22.582460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:d9e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.582474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.560 #47 NEW cov: 11798 ft: 14688 corp: 16/278b lim: 40 exec/s: 0 rss: 68Mb L: 30/39 MS: 1 ChangeBinInt- 00:08:04.560 [2024-12-13 07:03:22.622077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.622102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.560 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.560 #48 NEW cov: 11821 ft: 14731 corp: 17/290b lim: 40 exec/s: 0 rss: 68Mb L: 12/39 MS: 1 InsertByte- 00:08:04.560 [2024-12-13 07:03:22.672214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00003f00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.672238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.560 #49 NEW cov: 11821 ft: 14750 corp: 18/299b lim: 40 exec/s: 0 rss: 68Mb L: 9/39 MS: 1 EraseBytes- 00:08:04.560 [2024-12-13 07:03:22.712818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a5e5e5e 
cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.712844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.560 [2024-12-13 07:03:22.712900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:5e5e5e5e cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.712914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.560 [2024-12-13 07:03:22.712971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:5e5e5e5e cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.712984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.560 [2024-12-13 07:03:22.713038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:5e5e5e5e cdw11:5e5e5e5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.713051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.560 #50 NEW cov: 11821 ft: 14811 corp: 19/338b lim: 40 exec/s: 50 rss: 68Mb L: 39/39 MS: 1 CopyPart- 00:08:04.560 [2024-12-13 07:03:22.752609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.752635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.560 [2024-12-13 07:03:22.752708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.752722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.560 #51 NEW cov: 11821 ft: 14825 corp: 20/357b lim: 40 exec/s: 51 rss: 68Mb L: 19/39 MS: 1 ShuffleBytes- 00:08:04.560 [2024-12-13 07:03:22.792532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:3f000a00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.560 [2024-12-13 07:03:22.792557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.819 #52 NEW cov: 11821 ft: 14837 corp: 21/366b lim: 40 exec/s: 52 rss: 68Mb L: 9/39 MS: 1 CopyPart- 00:08:04.819 [2024-12-13 07:03:22.832663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:b6ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.819 [2024-12-13 07:03:22.832689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.819 #53 NEW cov: 11821 ft: 14848 corp: 22/374b lim: 40 exec/s: 53 rss: 68Mb L: 8/39 MS: 1 CopyPart- 00:08:04.819 [2024-12-13 07:03:22.872741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a1e5e5e cdw11:5e5e5e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.819 [2024-12-13 07:03:22.872765] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.819 #54 NEW cov: 11821 ft: 14886 corp: 23/382b lim: 40 exec/s: 54 rss: 68Mb L: 8/39 MS: 1 ChangeBit- 00:08:04.819 [2024-12-13 07:03:22.913031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000df00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.819 [2024-12-13 07:03:22.913056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.819 [2024-12-13 07:03:22.913115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.819 [2024-12-13 07:03:22.913129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.819 #55 NEW cov: 11821 ft: 14911 corp: 24/401b lim: 40 exec/s: 55 rss: 68Mb L: 19/39 MS: 1 ChangeByte- 00:08:04.819 [2024-12-13 07:03:22.952994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e06c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.819 [2024-12-13 07:03:22.953018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.819 #60 NEW cov: 11821 ft: 14923 corp: 25/411b lim: 40 exec/s: 60 rss: 68Mb L: 10/39 MS: 5 CopyPart-InsertByte-ChangeByte-ChangeBit-CrossOver- 00:08:04.819 [2024-12-13 07:03:22.993087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ff015e5e cdw11:5e5eaa5e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.819 [2024-12-13 07:03:22.993112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.819 #61 NEW cov: 11821 ft: 14954 corp: 26/420b lim: 40 exec/s: 61 rss: 69Mb L: 9/39 MS: 1 CrossOver- 00:08:04.819 [2024-12-13 07:03:23.033362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.819 [2024-12-13 07:03:23.033387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.819 [2024-12-13 07:03:23.033459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.819 [2024-12-13 07:03:23.033474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.078 #62 NEW cov: 11821 ft: 14967 corp: 27/439b lim: 40 exec/s: 62 rss: 69Mb L: 19/39 MS: 1 ChangeByte- 00:08:05.078 [2024-12-13 07:03:23.073329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a1e5e5e cdw11:5e5e5e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.078 [2024-12-13 07:03:23.073356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.078 #63 NEW cov: 11821 ft: 14993 corp: 28/447b lim: 40 exec/s: 63 rss: 69Mb L: 8/39 MS: 1 CrossOver- 00:08:05.078 [2024-12-13 07:03:23.113945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 
nsid:0 cdw10:001000ff cdw11:ffff00e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.078 [2024-12-13 07:03:23.113970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.078 [2024-12-13 07:03:23.114043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.078 [2024-12-13 07:03:23.114058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.078 [2024-12-13 07:03:23.114113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.078 [2024-12-13 07:03:23.114127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.078 [2024-12-13 07:03:23.114191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e0e0e0e0 cdw11:e0000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.078 [2024-12-13 07:03:23.114205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.078 #64 NEW cov: 11821 ft: 15003 corp: 29/480b lim: 40 exec/s: 64 rss: 69Mb L: 33/39 MS: 1 InsertRepeatedBytes- 00:08:05.079 [2024-12-13 07:03:23.153855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.153879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.079 [2024-12-13 07:03:23.153938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.153952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.079 [2024-12-13 07:03:23.154011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:e0e09ae0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.154024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.079 #65 NEW cov: 11821 ft: 15008 corp: 30/510b lim: 40 exec/s: 65 rss: 69Mb L: 30/39 MS: 1 CopyPart- 00:08:05.079 [2024-12-13 07:03:23.193798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.193823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.079 [2024-12-13 07:03:23.193883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.193896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.079 #66 NEW cov: 11821 ft: 15023 corp: 31/529b lim: 40 exec/s: 66 rss: 69Mb L: 19/39 MS: 1 
ShuffleBytes- 00:08:05.079 [2024-12-13 07:03:23.234115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00100000 cdw11:e0e4e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.234139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.079 [2024-12-13 07:03:23.234205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.234236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.079 [2024-12-13 07:03:23.234294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.234308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.079 #67 NEW cov: 11821 ft: 15028 corp: 32/559b lim: 40 exec/s: 67 rss: 69Mb L: 30/39 MS: 1 ChangeBit- 00:08:05.079 [2024-12-13 07:03:23.274262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00100000 cdw11:e0e4e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.274287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.079 [2024-12-13 07:03:23.274363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.274377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.079 [2024-12-13 07:03:23.274435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:e0e041e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.274449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.079 #73 NEW cov: 11821 ft: 15031 corp: 33/589b lim: 40 exec/s: 73 rss: 69Mb L: 30/39 MS: 1 ChangeByte- 00:08:05.079 [2024-12-13 07:03:23.314037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.079 [2024-12-13 07:03:23.314061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.338 #74 NEW cov: 11821 ft: 15048 corp: 34/599b lim: 40 exec/s: 74 rss: 69Mb L: 10/39 MS: 1 ChangeBinInt- 00:08:05.338 [2024-12-13 07:03:23.354477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.338 [2024-12-13 07:03:23.354501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.338 [2024-12-13 07:03:23.354573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.338 
[2024-12-13 07:03:23.354587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.338 [2024-12-13 07:03:23.354644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e09a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.338 [2024-12-13 07:03:23.354658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.338 #75 NEW cov: 11821 ft: 15063 corp: 35/629b lim: 40 exec/s: 75 rss: 69Mb L: 30/39 MS: 1 ShuffleBytes- 00:08:05.338 [2024-12-13 07:03:23.394795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00100000 cdw11:e0e4e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.338 [2024-12-13 07:03:23.394820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.338 [2024-12-13 07:03:23.394893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.338 [2024-12-13 07:03:23.394910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.338 [2024-12-13 07:03:23.394967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:e0e0e0e0 cdw11:e0e0e0e0 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.338 [2024-12-13 07:03:23.394981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.338 [2024-12-13 07:03:23.395037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:e041e041 cdw11:e0e0e000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.338 [2024-12-13 07:03:23.395051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.338 #76 NEW cov: 11821 ft: 15083 corp: 36/664b lim: 40 exec/s: 76 rss: 69Mb L: 35/39 MS: 1 CopyPart- 00:08:05.338 [2024-12-13 07:03:23.434566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00c5c5c5 cdw11:c5c5c5c5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.338 [2024-12-13 07:03:23.434590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.338 [2024-12-13 07:03:23.434646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:c5c5c5c5 cdw11:00003f5d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.338 [2024-12-13 07:03:23.434660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.339 #78 NEW cov: 11821 ft: 15107 corp: 37/682b lim: 40 exec/s: 78 rss: 69Mb L: 18/39 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:05.339 [2024-12-13 07:03:23.474514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a1e5e5a cdw11:5e5e5e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.339 [2024-12-13 07:03:23.474538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.339 #79 NEW cov: 11821 ft: 15108 corp: 38/690b lim: 40 exec/s: 
79 rss: 70Mb L: 8/39 MS: 1 ChangeBinInt- 00:08:05.339 [2024-12-13 07:03:23.514616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1a1e5e5e cdw11:5e5e5e0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.339 [2024-12-13 07:03:23.514640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.339 #80 NEW cov: 11821 ft: 15116 corp: 39/698b lim: 40 exec/s: 80 rss: 70Mb L: 8/39 MS: 1 ChangeBit- 00:08:05.339 [2024-12-13 07:03:23.554717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:b6ffffff cdw11:ff2dffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.339 [2024-12-13 07:03:23.554742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.598 #81 NEW cov: 11821 ft: 15131 corp: 40/707b lim: 40 exec/s: 81 rss: 70Mb L: 9/39 MS: 1 InsertByte- 00:08:05.598 [2024-12-13 07:03:23.594995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.598 [2024-12-13 07:03:23.595019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.598 [2024-12-13 07:03:23.595079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:002e0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.598 [2024-12-13 07:03:23.595092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.598 #82 NEW cov: 11821 ft: 15132 corp: 41/726b lim: 40 exec/s: 82 rss: 70Mb L: 19/39 MS: 1 ChangeByte- 00:08:05.598 [2024-12-13 07:03:23.634961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:b6ffffb8 cdw11:ff2dffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.598 [2024-12-13 07:03:23.634991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.598 #83 NEW cov: 11821 ft: 15147 corp: 42/735b lim: 40 exec/s: 83 rss: 70Mb L: 9/39 MS: 1 ChangeByte- 00:08:05.598 [2024-12-13 07:03:23.675089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00e83f00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.598 [2024-12-13 07:03:23.675114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.598 #84 NEW cov: 11821 ft: 15151 corp: 43/744b lim: 40 exec/s: 84 rss: 70Mb L: 9/39 MS: 1 ChangeByte- 00:08:05.598 [2024-12-13 07:03:23.715356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.598 [2024-12-13 07:03:23.715382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.598 [2024-12-13 07:03:23.715438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.598 [2024-12-13 07:03:23.715451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:05.598 #85 NEW cov: 11821 ft: 15159 corp: 44/764b lim: 40 exec/s: 42 rss: 70Mb L: 20/39 MS: 1 InsertByte- 00:08:05.598 #85 DONE cov: 11821 ft: 15159 corp: 44/764b lim: 40 exec/s: 42 rss: 70Mb 00:08:05.598 Done 85 runs in 2 second(s) 00:08:05.857 07:03:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:05.857 07:03:23 -- ../common.sh@72 -- # (( i++ )) 00:08:05.857 07:03:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.857 07:03:23 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:05.857 07:03:23 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:05.857 07:03:23 -- nvmf/run.sh@24 -- # local timen=1 00:08:05.857 07:03:23 -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.857 07:03:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:05.857 07:03:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:05.857 07:03:23 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:05.857 07:03:23 -- nvmf/run.sh@29 -- # port=4413 00:08:05.857 07:03:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:05.857 07:03:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:05.857 07:03:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.857 07:03:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:05.857 [2024-12-13 07:03:23.889712] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:05.857 [2024-12-13 07:03:23.889775] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid495581 ] 00:08:05.857 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.857 [2024-12-13 07:03:24.068864] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.857 [2024-12-13 07:03:24.088466] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.857 [2024-12-13 07:03:24.088602] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.116 [2024-12-13 07:03:24.139936] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:06.116 [2024-12-13 07:03:24.156295] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:06.116 INFO: Running with entropic power schedule (0xFF, 100). 
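The trace above shows how run.sh brings up each nvmf fuzz target: it derives the TCP service port from the fuzzer number (44 plus printf %02d, so target 13 listens on 4413), creates a per-target corpus directory, rewrites trsvcid in the shared JSON config with sed, and launches llvm_nvme_fuzz against the resulting transport ID. Below is a minimal shell sketch of re-running target 13 by hand; every path and flag is copied from the trace, while the FUZZER/PORT/SPDK/CORPUS variable names are illustrative shorthand, not part of run.sh itself.

  # Sketch: relaunch nvmf fuzz target 13 the way run.sh does in the trace above.
  # Assumes the same short-fuzz-phy-autotest workspace layout and a built tree.
  FUZZER=13
  PORT="44$(printf '%02d' "$FUZZER")"        # 4413, as computed in the trace
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  CORPUS="$SPDK/../corpus/llvm_nvmf_$FUZZER"
  mkdir -p "$CORPUS"
  # Retarget the shared JSON config at this fuzzer's port before launching.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$PORT\"/" \
      "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$FUZZER.conf"
  "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK/../output/llvm/" \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$PORT" \
      -c "/tmp/fuzz_json_$FUZZER.conf" -t 1 -D "$CORPUS" -Z "$FUZZER" -r "/var/tmp/spdk$FUZZER.sock"

Bumping FUZZER to 14 reproduces the next block of this log, where the same steps appear with port 4414 and /tmp/fuzz_json_14.conf; since the corpus directories start out empty, each run reports "A corpus is not provided, starting from an empty corpus" as seen above.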
00:08:06.116 INFO: Seed: 2773914051 00:08:06.116 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:06.116 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:06.116 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:06.116 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.116 #2 INITED exec/s: 0 rss: 59Mb 00:08:06.116 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:06.116 This may also happen if the target rejected all inputs we tried so far 00:08:06.116 [2024-12-13 07:03:24.201394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.116 [2024-12-13 07:03:24.201422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.374 NEW_FUNC[1/670]: 0x4638f8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:06.374 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.375 #3 NEW cov: 11582 ft: 11575 corp: 2/11b lim: 40 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:08:06.375 [2024-12-13 07:03:24.502192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a1a1 cdw11:0aa1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.375 [2024-12-13 07:03:24.502222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.375 #4 NEW cov: 11695 ft: 12175 corp: 3/22b lim: 40 exec/s: 0 rss: 67Mb L: 11/11 MS: 1 CrossOver- 00:08:06.375 [2024-12-13 07:03:24.542286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.375 [2024-12-13 07:03:24.542312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.375 #5 NEW cov: 11701 ft: 12325 corp: 4/32b lim: 40 exec/s: 0 rss: 67Mb L: 10/11 MS: 1 ShuffleBytes- 00:08:06.375 [2024-12-13 07:03:24.582391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.375 [2024-12-13 07:03:24.582416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.375 #6 NEW cov: 11786 ft: 12606 corp: 5/42b lim: 40 exec/s: 0 rss: 67Mb L: 10/11 MS: 1 CrossOver- 00:08:06.633 [2024-12-13 07:03:24.622778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3c545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.633 [2024-12-13 07:03:24.622803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.633 [2024-12-13 07:03:24.622879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.633 [2024-12-13 07:03:24.622893] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.633 [2024-12-13 07:03:24.622952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-12-13 07:03:24.622966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.634 #9 NEW cov: 11786 ft: 13119 corp: 6/71b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:08:06.634 [2024-12-13 07:03:24.662571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a21a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-12-13 07:03:24.662596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 #10 NEW cov: 11786 ft: 13178 corp: 7/81b lim: 40 exec/s: 0 rss: 67Mb L: 10/29 MS: 1 ChangeBit- 00:08:06.634 [2024-12-13 07:03:24.702647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-12-13 07:03:24.702673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 #11 NEW cov: 11786 ft: 13287 corp: 8/92b lim: 40 exec/s: 0 rss: 67Mb L: 11/29 MS: 1 InsertByte- 00:08:06.634 [2024-12-13 07:03:24.742825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a21a10a cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-12-13 07:03:24.742849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 #12 NEW cov: 11786 ft: 13337 corp: 9/105b lim: 40 exec/s: 0 rss: 67Mb L: 13/29 MS: 1 CrossOver- 00:08:06.634 [2024-12-13 07:03:24.782932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a5a1 cdw11:0aa1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-12-13 07:03:24.782957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 #13 NEW cov: 11786 ft: 13440 corp: 10/116b lim: 40 exec/s: 0 rss: 67Mb L: 11/29 MS: 1 ChangeBit- 00:08:06.634 [2024-12-13 07:03:24.823185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a21a1a1 cdw11:a1a13c54 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-12-13 07:03:24.823214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.634 [2024-12-13 07:03:24.823288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-12-13 07:03:24.823303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.634 #14 NEW cov: 11786 ft: 13631 corp: 11/137b lim: 40 exec/s: 0 rss: 67Mb L: 21/29 MS: 1 CrossOver- 00:08:06.634 [2024-12-13 07:03:24.863147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.634 [2024-12-13 07:03:24.863172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 #15 NEW cov: 11786 ft: 13663 corp: 12/147b lim: 40 exec/s: 0 rss: 67Mb L: 10/29 MS: 1 ShuffleBytes- 00:08:06.893 [2024-12-13 07:03:24.903238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-12-13 07:03:24.903262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 #16 NEW cov: 11786 ft: 13674 corp: 13/157b lim: 40 exec/s: 0 rss: 67Mb L: 10/29 MS: 1 ChangeBit- 00:08:06.893 [2024-12-13 07:03:24.933337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0acaa1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-12-13 07:03:24.933361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 #17 NEW cov: 11786 ft: 13760 corp: 14/167b lim: 40 exec/s: 0 rss: 67Mb L: 10/29 MS: 1 ChangeByte- 00:08:06.893 [2024-12-13 07:03:24.973481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa10aa1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-12-13 07:03:24.973505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 #18 NEW cov: 11786 ft: 13805 corp: 15/177b lim: 40 exec/s: 0 rss: 67Mb L: 10/29 MS: 1 ChangeBinInt- 00:08:06.893 [2024-12-13 07:03:25.003586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aa10a cdw11:a121a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-12-13 07:03:25.003611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 #19 NEW cov: 11786 ft: 13836 corp: 16/191b lim: 40 exec/s: 0 rss: 67Mb L: 14/29 MS: 1 CrossOver- 00:08:06.893 [2024-12-13 07:03:25.033671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa10aa1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-12-13 07:03:25.033695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 #20 NEW cov: 11786 ft: 13883 corp: 17/206b lim: 40 exec/s: 0 rss: 67Mb L: 15/29 MS: 1 InsertRepeatedBytes- 00:08:06.893 [2024-12-13 07:03:25.074082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aa10a cdw11:a121a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-12-13 07:03:25.074107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.893 [2024-12-13 07:03:25.074182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a1a1a100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-12-13 07:03:25.074201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.893 [2024-12-13 07:03:25.074264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-12-13 07:03:25.074277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.893 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.893 #21 NEW cov: 11809 ft: 13958 corp: 18/233b lim: 40 exec/s: 0 rss: 68Mb L: 27/29 MS: 1 InsertRepeatedBytes- 00:08:06.893 [2024-12-13 07:03:25.113924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a21a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.893 [2024-12-13 07:03:25.113949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.152 #22 NEW cov: 11809 ft: 13976 corp: 19/246b lim: 40 exec/s: 0 rss: 68Mb L: 13/29 MS: 1 ShuffleBytes- 00:08:07.152 [2024-12-13 07:03:25.154067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a1a1a1a1 cdw11:0aa9a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.154091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.152 #23 NEW cov: 11809 ft: 13990 corp: 20/256b lim: 40 exec/s: 0 rss: 68Mb L: 10/29 MS: 1 ShuffleBytes- 00:08:07.152 [2024-12-13 07:03:25.194561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aa10a cdw11:a1030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.194586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.194646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:03030303 cdw11:03030303 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.194660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.194718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:21a1a1a1 cdw11:a1a10000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.194732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.194793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.194808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.152 #24 NEW cov: 11809 ft: 14529 corp: 21/294b lim: 40 exec/s: 24 rss: 68Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:07.152 [2024-12-13 07:03:25.244426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a5a1 cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.244452] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.244514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:abababab cdw11:ababab0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.244528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.152 #25 NEW cov: 11809 ft: 14536 corp: 22/316b lim: 40 exec/s: 25 rss: 68Mb L: 22/38 MS: 1 InsertRepeatedBytes- 00:08:07.152 [2024-12-13 07:03:25.284418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a21a1a1 cdw11:a1a15f5e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.284443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.152 #26 NEW cov: 11809 ft: 14564 corp: 23/329b lim: 40 exec/s: 26 rss: 68Mb L: 13/38 MS: 1 ChangeBinInt- 00:08:07.152 [2024-12-13 07:03:25.324874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3c545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.324899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.324975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.324989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.325050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.325063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.325122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:5454540a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.325135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.152 #27 NEW cov: 11809 ft: 14579 corp: 24/366b lim: 40 exec/s: 27 rss: 68Mb L: 37/38 MS: 1 CopyPart- 00:08:07.152 [2024-12-13 07:03:25.365020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3c545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.365045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.365120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.365134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.365195] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:54545412 cdw11:12121212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.365212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.152 [2024-12-13 07:03:25.365268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.152 [2024-12-13 07:03:25.365281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.152 #28 NEW cov: 11809 ft: 14595 corp: 25/400b lim: 40 exec/s: 28 rss: 68Mb L: 34/38 MS: 1 InsertRepeatedBytes- 00:08:07.410 [2024-12-13 07:03:25.404726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a21a15f cdw11:a15ea1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.410 [2024-12-13 07:03:25.404751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.410 #29 NEW cov: 11809 ft: 14601 corp: 26/413b lim: 40 exec/s: 29 rss: 68Mb L: 13/38 MS: 1 ShuffleBytes- 00:08:07.410 [2024-12-13 07:03:25.445410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.410 [2024-12-13 07:03:25.445434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.445506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.445520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.445577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.445592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.445648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.445662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.445719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:72727272 cdw11:72727272 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.445732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.411 #30 NEW cov: 11809 ft: 14718 corp: 27/453b lim: 40 exec/s: 30 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:07.411 [2024-12-13 07:03:25.485212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3c545454 cdw11:5454543c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 
07:03:25.485238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.485314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.485328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.485391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.485405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.411 #31 NEW cov: 11809 ft: 14744 corp: 28/482b lim: 40 exec/s: 31 rss: 68Mb L: 29/40 MS: 1 CopyPart- 00:08:07.411 [2024-12-13 07:03:25.525354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3c545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.525378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.525440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.525454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.525516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:54544454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.525530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.411 #32 NEW cov: 11809 ft: 14766 corp: 29/511b lim: 40 exec/s: 32 rss: 68Mb L: 29/40 MS: 1 ChangeBit- 00:08:07.411 [2024-12-13 07:03:25.565474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3c544a54 cdw11:5454543c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.565499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.565556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.565570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.565640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.565655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.411 #33 NEW cov: 11809 ft: 14767 corp: 30/540b lim: 40 exec/s: 33 rss: 68Mb L: 29/40 MS: 1 ChangeBinInt- 00:08:07.411 [2024-12-13 07:03:25.605461] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.605486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.605542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:74adebc2 cdw11:aae70200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.605556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.411 #34 NEW cov: 11809 ft: 14780 corp: 31/558b lim: 40 exec/s: 34 rss: 69Mb L: 18/40 MS: 1 CMP- DE: "t\255\353\302\252\347\002\000"- 00:08:07.411 [2024-12-13 07:03:25.645868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0e0e0e0e cdw11:0e0e0e0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.645892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.645965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0e0e0e0e cdw11:0e0e0e0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.645980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.646038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0e0e0aa1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.646054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.411 [2024-12-13 07:03:25.646111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a1a174ad cdw11:ebc2aae7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.411 [2024-12-13 07:03:25.646125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.670 #35 NEW cov: 11809 ft: 14800 corp: 32/594b lim: 40 exec/s: 35 rss: 69Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:08:07.670 [2024-12-13 07:03:25.685575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0acaa1a1 cdw11:a1a1a149 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.670 [2024-12-13 07:03:25.685599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.670 #36 NEW cov: 11809 ft: 14811 corp: 33/609b lim: 40 exec/s: 36 rss: 69Mb L: 15/40 MS: 1 InsertRepeatedBytes- 00:08:07.670 [2024-12-13 07:03:25.725695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa10000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.670 [2024-12-13 07:03:25.725719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.670 #37 NEW cov: 11809 ft: 14862 corp: 34/619b lim: 40 exec/s: 37 rss: 69Mb L: 10/40 MS: 1 ChangeBinInt- 00:08:07.670 [2024-12-13 07:03:25.755779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9f4a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.670 [2024-12-13 07:03:25.755804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.670 #38 NEW cov: 11809 ft: 14876 corp: 35/629b lim: 40 exec/s: 38 rss: 69Mb L: 10/40 MS: 1 ChangeByte- 00:08:07.670 [2024-12-13 07:03:25.796020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a5a1 cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.670 [2024-12-13 07:03:25.796045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.670 [2024-12-13 07:03:25.796120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:abababab cdw11:ababab0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.670 [2024-12-13 07:03:25.796134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.670 #39 NEW cov: 11809 ft: 14888 corp: 36/651b lim: 40 exec/s: 39 rss: 69Mb L: 22/40 MS: 1 ChangeBit- 00:08:07.670 [2024-12-13 07:03:25.836149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0acaa1a1 cdw11:a1a1abab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.670 [2024-12-13 07:03:25.836173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.670 [2024-12-13 07:03:25.836249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:abababab cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.670 [2024-12-13 07:03:25.836263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.670 #40 NEW cov: 11809 ft: 14890 corp: 37/673b lim: 40 exec/s: 40 rss: 69Mb L: 22/40 MS: 1 CrossOver- 00:08:07.670 [2024-12-13 07:03:25.876130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0adba1a1 cdw11:a1a15f5e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.670 [2024-12-13 07:03:25.876155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.670 #41 NEW cov: 11809 ft: 14924 corp: 38/686b lim: 40 exec/s: 41 rss: 69Mb L: 13/40 MS: 1 ChangeBinInt- 00:08:07.930 [2024-12-13 07:03:25.916673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3c54543d cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:25.916697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 [2024-12-13 07:03:25.916774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:25.916788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.930 [2024-12-13 07:03:25.916849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:25.916862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.930 [2024-12-13 07:03:25.916920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:25.916933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.930 #42 NEW cov: 11809 ft: 14930 corp: 39/724b lim: 40 exec/s: 42 rss: 69Mb L: 38/40 MS: 1 InsertByte- 00:08:07.930 [2024-12-13 07:03:25.956384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa15da1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:25.956408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 #43 NEW cov: 11809 ft: 14933 corp: 40/734b lim: 40 exec/s: 43 rss: 69Mb L: 10/40 MS: 1 ChangeByte- 00:08:07.930 [2024-12-13 07:03:25.986429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:25.986453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 #44 NEW cov: 11809 ft: 14935 corp: 41/744b lim: 40 exec/s: 44 rss: 69Mb L: 10/40 MS: 1 ChangeByte- 00:08:07.930 [2024-12-13 07:03:26.016551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a50a1a1 cdw11:a1a15f5e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:26.016575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 #45 NEW cov: 11809 ft: 14948 corp: 42/757b lim: 40 exec/s: 45 rss: 69Mb L: 13/40 MS: 1 ChangeByte- 00:08:07.930 [2024-12-13 07:03:26.047026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3c545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:26.047051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 [2024-12-13 07:03:26.047108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:26.047121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.930 [2024-12-13 07:03:26.047179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:54545412 cdw11:12121212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:26.047196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.930 [2024-12-13 07:03:26.047256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:26.047272] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.930 #46 NEW cov: 11809 ft: 14962 corp: 43/794b lim: 40 exec/s: 46 rss: 69Mb L: 37/40 MS: 1 CopyPart- 00:08:07.930 [2024-12-13 07:03:26.086802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a8a1a1a1 cdw11:0aa9a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:26.086827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 #47 NEW cov: 11809 ft: 15001 corp: 44/804b lim: 40 exec/s: 47 rss: 69Mb L: 10/40 MS: 1 ChangeBinInt- 00:08:07.930 [2024-12-13 07:03:26.126902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a0aa10a cdw11:a121a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:26.126928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 #48 NEW cov: 11809 ft: 15013 corp: 45/819b lim: 40 exec/s: 48 rss: 69Mb L: 15/40 MS: 1 InsertByte- 00:08:07.930 [2024-12-13 07:03:26.167169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:26.167198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 [2024-12-13 07:03:26.167257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:74b7ebc2 cdw11:aae70200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-12-13 07:03:26.167272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.190 #49 NEW cov: 11809 ft: 15018 corp: 46/837b lim: 40 exec/s: 24 rss: 69Mb L: 18/40 MS: 1 ChangeByte- 00:08:08.190 #49 DONE cov: 11809 ft: 15018 corp: 46/837b lim: 40 exec/s: 24 rss: 69Mb 00:08:08.190 ###### Recommended dictionary. ###### 00:08:08.190 "t\255\353\302\252\347\002\000" # Uses: 0 00:08:08.190 ###### End of recommended dictionary. 
###### 00:08:08.190 Done 49 runs in 2 second(s) 00:08:08.190 07:03:26 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:08.190 07:03:26 -- ../common.sh@72 -- # (( i++ )) 00:08:08.190 07:03:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.190 07:03:26 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:08.190 07:03:26 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:08.190 07:03:26 -- nvmf/run.sh@24 -- # local timen=1 00:08:08.190 07:03:26 -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.190 07:03:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:08.190 07:03:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:08.190 07:03:26 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:08.190 07:03:26 -- nvmf/run.sh@29 -- # port=4414 00:08:08.190 07:03:26 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:08.190 07:03:26 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:08.190 07:03:26 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.190 07:03:26 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:08.190 [2024-12-13 07:03:26.341092] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:08.190 [2024-12-13 07:03:26.341167] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid496008 ] 00:08:08.190 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.449 [2024-12-13 07:03:26.522109] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.449 [2024-12-13 07:03:26.541375] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:08.449 [2024-12-13 07:03:26.541511] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.449 [2024-12-13 07:03:26.592783] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.449 [2024-12-13 07:03:26.609105] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:08.449 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.449 INFO: Seed: 932964029 00:08:08.449 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:08.449 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:08.449 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:08.449 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.449 #2 INITED exec/s: 0 rss: 59Mb 00:08:08.449 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:08.449 This may also happen if the target rejected all inputs we tried so far 00:08:08.966 NEW_FUNC[1/653]: 0x4654c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:08.966 NEW_FUNC[2/653]: 0x4868f8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:08.966 #20 NEW cov: 11428 ft: 11429 corp: 2/10b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:08.966 NEW_FUNC[1/5]: 0x16c3978 in spdk_nvme_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:757 00:08:08.966 NEW_FUNC[2/5]: 0x1727568 in nvme_transport_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:606 00:08:08.966 #26 NEW cov: 11585 ft: 11911 corp: 3/19b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:08.966 #27 NEW cov: 11591 ft: 12230 corp: 4/28b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:08.966 [2024-12-13 07:03:27.094815] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.966 [2024-12-13 07:03:27.094857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.966 [2024-12-13 07:03:27.094905] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.966 [2024-12-13 07:03:27.094921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.966 [2024-12-13 07:03:27.094950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.966 [2024-12-13 07:03:27.094965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.966 NEW_FUNC[1/15]: 0x16b5478 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:08:08.966 NEW_FUNC[2/15]: 0x16b56b8 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:08:08.966 #30 NEW cov: 11813 ft: 13269 corp: 5/51b lim: 35 exec/s: 0 rss: 67Mb L: 23/23 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:08.966 [2024-12-13 07:03:27.154969] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.966 [2024-12-13 07:03:27.155001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.966 [2024-12-13 07:03:27.155050] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.966 [2024-12-13 07:03:27.155067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.966 [2024-12-13 07:03:27.155101] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.966 [2024-12-13 07:03:27.155117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.966 #31 NEW cov: 11813 ft: 13455 corp: 6/74b lim: 35 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 ChangeBit- 00:08:09.225 [2024-12-13 07:03:27.225248] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.225 [2024-12-13 07:03:27.225278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.225 [2024-12-13 07:03:27.225326] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.225 [2024-12-13 07:03:27.225341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.225 [2024-12-13 07:03:27.225371] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.225 [2024-12-13 07:03:27.225386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.225 [2024-12-13 07:03:27.225415] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.225 [2024-12-13 07:03:27.225430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.225 [2024-12-13 07:03:27.225459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.225 [2024-12-13 07:03:27.225474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.225 #32 NEW cov: 11813 ft: 13976 corp: 7/109b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 CopyPart- 00:08:09.225 [2024-12-13 07:03:27.285149] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000006b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.225 [2024-12-13 07:03:27.285179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.225 #33 NEW cov: 11813 ft: 14150 corp: 8/118b lim: 35 exec/s: 0 rss: 67Mb L: 9/35 MS: 1 ShuffleBytes- 00:08:09.225 #34 NEW cov: 11813 ft: 14228 corp: 9/127b lim: 35 exec/s: 0 rss: 67Mb L: 9/35 MS: 1 ChangeBit- 00:08:09.226 [2024-12-13 07:03:27.395485] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000006b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.226 [2024-12-13 07:03:27.395515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.226 #35 NEW cov: 11813 ft: 14248 corp: 10/136b lim: 35 exec/s: 0 rss: 67Mb L: 9/35 MS: 1 ChangeBinInt- 00:08:09.484 [2024-12-13 07:03:27.465816] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.484 [2024-12-13 07:03:27.465848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.484 [2024-12-13 07:03:27.465882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.485 
[2024-12-13 07:03:27.465898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.485 [2024-12-13 07:03:27.465928] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.485 [2024-12-13 07:03:27.465943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.485 #41 NEW cov: 11813 ft: 14324 corp: 11/160b lim: 35 exec/s: 0 rss: 68Mb L: 24/35 MS: 1 InsertByte- 00:08:09.485 [2024-12-13 07:03:27.536002] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.485 [2024-12-13 07:03:27.536032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.485 [2024-12-13 07:03:27.536065] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.485 [2024-12-13 07:03:27.536080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.485 [2024-12-13 07:03:27.536109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.485 [2024-12-13 07:03:27.536124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.485 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.485 #42 NEW cov: 11836 ft: 14400 corp: 12/183b lim: 35 exec/s: 0 rss: 68Mb L: 23/35 MS: 1 ShuffleBytes- 00:08:09.485 #43 NEW cov: 11836 ft: 14472 corp: 13/192b lim: 35 exec/s: 0 rss: 68Mb L: 9/35 MS: 1 ShuffleBytes- 00:08:09.485 [2024-12-13 07:03:27.636273] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000006b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.485 [2024-12-13 07:03:27.636303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.485 [2024-12-13 07:03:27.636335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000001b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.485 [2024-12-13 07:03:27.636350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.485 #44 NEW cov: 11836 ft: 14596 corp: 14/216b lim: 35 exec/s: 44 rss: 68Mb L: 24/35 MS: 1 InsertRepeatedBytes- 00:08:09.744 #45 NEW cov: 11836 ft: 14694 corp: 15/225b lim: 35 exec/s: 45 rss: 68Mb L: 9/35 MS: 1 ShuffleBytes- 00:08:09.744 [2024-12-13 07:03:27.756575] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.756605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.744 [2024-12-13 07:03:27.756638] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.756653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.744 [2024-12-13 07:03:27.756682] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.756698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.744 #46 NEW cov: 11836 ft: 14708 corp: 16/249b lim: 35 exec/s: 46 rss: 68Mb L: 24/35 MS: 1 ChangeByte- 00:08:09.744 [2024-12-13 07:03:27.826697] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.826726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.744 [2024-12-13 07:03:27.826774] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.826789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.744 [2024-12-13 07:03:27.826818] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.826838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.744 #47 NEW cov: 11836 ft: 14753 corp: 17/274b lim: 35 exec/s: 47 rss: 68Mb L: 25/35 MS: 1 InsertByte- 00:08:09.744 [2024-12-13 07:03:27.876864] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.876896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.744 [2024-12-13 07:03:27.876945] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.876961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.744 [2024-12-13 07:03:27.876991] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.877007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.744 #48 NEW cov: 11836 ft: 14841 corp: 18/297b lim: 35 exec/s: 48 rss: 68Mb L: 23/35 MS: 1 ChangeBinInt- 00:08:09.744 [2024-12-13 07:03:27.926829] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000eb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.744 [2024-12-13 07:03:27.926857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.744 #49 NEW cov: 11836 ft: 14853 corp: 19/306b lim: 35 exec/s: 49 rss: 68Mb L: 9/35 MS: 1 ChangeBit- 00:08:10.003 [2024-12-13 07:03:27.987172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.003 [2024-12-13 07:03:27.987209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.003 [2024-12-13 07:03:27.987244] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.003 [2024-12-13 07:03:27.987259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.003 [2024-12-13 07:03:27.987289] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.003 [2024-12-13 07:03:27.987304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.003 #50 NEW cov: 11836 ft: 14860 corp: 20/329b lim: 35 exec/s: 50 rss: 68Mb L: 23/35 MS: 1 ChangeByte- 00:08:10.003 #51 NEW cov: 11836 ft: 14871 corp: 21/336b lim: 35 exec/s: 51 rss: 68Mb L: 7/35 MS: 1 EraseBytes- 00:08:10.003 [2024-12-13 07:03:28.098548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.003 [2024-12-13 07:03:28.098582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.003 [2024-12-13 07:03:28.098649] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.003 [2024-12-13 07:03:28.098668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.003 [2024-12-13 07:03:28.098741] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.003 [2024-12-13 07:03:28.098759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.003 [2024-12-13 07:03:28.098825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000b1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.003 [2024-12-13 07:03:28.098842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.003 #52 NEW cov: 11836 ft: 14987 corp: 22/370b lim: 35 exec/s: 52 rss: 68Mb L: 34/35 MS: 1 CopyPart- 00:08:10.004 [2024-12-13 07:03:28.138463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.138488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.004 [2024-12-13 07:03:28.138552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.138566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.004 [2024-12-13 07:03:28.138625] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.138639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.004 #53 NEW cov: 11836 ft: 15107 corp: 23/394b 
lim: 35 exec/s: 53 rss: 68Mb L: 24/35 MS: 1 InsertByte- 00:08:10.004 [2024-12-13 07:03:28.178766] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.178793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.004 [2024-12-13 07:03:28.178850] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.178866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.004 [2024-12-13 07:03:28.178922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.178938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.004 #54 NEW cov: 11843 ft: 15178 corp: 24/428b lim: 35 exec/s: 54 rss: 68Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:08:10.004 [2024-12-13 07:03:28.218822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.218847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.004 [2024-12-13 07:03:28.218906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.218920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.004 [2024-12-13 07:03:28.218978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.218991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.004 [2024-12-13 07:03:28.219048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000b1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.004 [2024-12-13 07:03:28.219062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.263 #55 NEW cov: 11843 ft: 15187 corp: 25/462b lim: 35 exec/s: 55 rss: 68Mb L: 34/35 MS: 1 ChangeByte- 00:08:10.263 [2024-12-13 07:03:28.268657] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000006b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.263 [2024-12-13 07:03:28.268684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.263 [2024-12-13 07:03:28.268743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000009f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.263 [2024-12-13 07:03:28.268760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.263 #56 NEW cov: 11843 ft: 15270 corp: 26/479b lim: 35 exec/s: 56 rss: 68Mb L: 17/35 MS: 1 CMP- DE: 
"\377\001\347\254Mg\237\022"- 00:08:10.263 [2024-12-13 07:03:28.309071] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000006b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.263 [2024-12-13 07:03:28.309097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.263 [2024-12-13 07:03:28.309176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000006b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.263 [2024-12-13 07:03:28.309195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.263 [2024-12-13 07:03:28.309256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.263 [2024-12-13 07:03:28.309280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.263 [2024-12-13 07:03:28.309337] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.263 [2024-12-13 07:03:28.309350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.263 #57 NEW cov: 11843 ft: 15284 corp: 27/512b lim: 35 exec/s: 57 rss: 68Mb L: 33/35 MS: 1 CrossOver- 00:08:10.263 [2024-12-13 07:03:28.349180] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.263 [2024-12-13 07:03:28.349211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.263 [2024-12-13 07:03:28.349269] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ARBITRATION cid:5 cdw10:80000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.263 [2024-12-13 07:03:28.349286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.263 [2024-12-13 07:03:28.349365] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.264 [2024-12-13 07:03:28.349379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.264 [2024-12-13 07:03:28.349441] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.264 [2024-12-13 07:03:28.349454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.264 NEW_FUNC[1/1]: 0x47fd88 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:10.264 #58 NEW cov: 11877 ft: 15361 corp: 28/543b lim: 35 exec/s: 58 rss: 68Mb L: 31/35 MS: 1 PersAutoDict- DE: "\377\001\347\254Mg\237\022"- 00:08:10.264 [2024-12-13 07:03:28.398850] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES ASYNC EVENT CONFIGURATION cid:4 cdw10:8000000b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.264 [2024-12-13 07:03:28.398878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.264 NEW_FUNC[1/1]: 0x486dc8 in feat_async_event_cfg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:346 00:08:10.264 #60 NEW cov: 11977 ft: 15529 corp: 29/553b lim: 35 exec/s: 60 rss: 68Mb L: 10/35 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:10.264 [2024-12-13 07:03:28.438916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000006b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.264 [2024-12-13 07:03:28.438945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.264 #61 NEW cov: 11977 ft: 15536 corp: 30/562b lim: 35 exec/s: 61 rss: 68Mb L: 9/35 MS: 1 ChangeByte- 00:08:10.264 [2024-12-13 07:03:28.479057] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.264 [2024-12-13 07:03:28.479084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.264 #62 NEW cov: 11977 ft: 15576 corp: 31/571b lim: 35 exec/s: 62 rss: 68Mb L: 9/35 MS: 1 PersAutoDict- DE: "\377\001\347\254Mg\237\022"- 00:08:10.523 [2024-12-13 07:03:28.519552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.519577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.523 [2024-12-13 07:03:28.519619] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.519633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.523 [2024-12-13 07:03:28.519693] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.519706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.523 #63 NEW cov: 11977 ft: 15592 corp: 32/594b lim: 35 exec/s: 63 rss: 68Mb L: 23/35 MS: 1 ChangeBit- 00:08:10.523 [2024-12-13 07:03:28.559665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.559690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.523 [2024-12-13 07:03:28.559751] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.559765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.523 [2024-12-13 07:03:28.559825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000048 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.559839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.523 #64 NEW cov: 11977 ft: 15596 corp: 33/617b lim: 35 exec/s: 
64 rss: 68Mb L: 23/35 MS: 1 ShuffleBytes- 00:08:10.523 [2024-12-13 07:03:28.599441] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.599465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.523 #65 NEW cov: 11977 ft: 15605 corp: 34/630b lim: 35 exec/s: 65 rss: 68Mb L: 13/35 MS: 1 EraseBytes- 00:08:10.523 [2024-12-13 07:03:28.639970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.639997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.523 [2024-12-13 07:03:28.640057] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.640072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.523 [2024-12-13 07:03:28.640132] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.523 [2024-12-13 07:03:28.640148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.523 [2024-12-13 07:03:28.640206] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.524 [2024-12-13 07:03:28.640221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.524 #66 NEW cov: 11977 ft: 15636 corp: 35/660b lim: 35 exec/s: 33 rss: 68Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:08:10.524 #66 DONE cov: 11977 ft: 15636 corp: 35/660b lim: 35 exec/s: 33 rss: 68Mb 00:08:10.524 ###### Recommended dictionary. ###### 00:08:10.524 "\377\001\347\254Mg\237\022" # Uses: 2 00:08:10.524 ###### End of recommended dictionary. 
###### 00:08:10.524 Done 66 runs in 2 second(s) 00:08:10.783 07:03:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:10.783 07:03:28 -- ../common.sh@72 -- # (( i++ )) 00:08:10.783 07:03:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.783 07:03:28 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:10.783 07:03:28 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:10.783 07:03:28 -- nvmf/run.sh@24 -- # local timen=1 00:08:10.783 07:03:28 -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.783 07:03:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:10.783 07:03:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:10.783 07:03:28 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:10.783 07:03:28 -- nvmf/run.sh@29 -- # port=4415 00:08:10.783 07:03:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:10.783 07:03:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:10.783 07:03:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.783 07:03:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:10.783 [2024-12-13 07:03:28.815510] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:10.783 [2024-12-13 07:03:28.815574] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid496542 ] 00:08:10.783 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.783 [2024-12-13 07:03:28.990916] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.783 [2024-12-13 07:03:29.010508] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.783 [2024-12-13 07:03:29.010633] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.041 [2024-12-13 07:03:29.061927] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.041 [2024-12-13 07:03:29.078238] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:11.041 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.041 INFO: Seed: 3400970383 00:08:11.041 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:11.041 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:11.041 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:11.041 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.041 #2 INITED exec/s: 0 rss: 59Mb 00:08:11.041 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:11.041 This may also happen if the target rejected all inputs we tried so far 00:08:11.042 [2024-12-13 07:03:29.123479] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.042 [2024-12-13 07:03:29.123510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.299 NEW_FUNC[1/670]: 0x466a08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:11.299 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.299 #9 NEW cov: 11564 ft: 11565 corp: 2/8b lim: 35 exec/s: 0 rss: 67Mb L: 7/7 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:11.299 [2024-12-13 07:03:29.424219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.299 [2024-12-13 07:03:29.424249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.299 #15 NEW cov: 11677 ft: 12069 corp: 3/17b lim: 35 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:11.299 [2024-12-13 07:03:29.464265] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.299 [2024-12-13 07:03:29.464291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.299 #16 NEW cov: 11683 ft: 12371 corp: 4/27b lim: 35 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 InsertByte- 00:08:11.299 [2024-12-13 07:03:29.504508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.299 [2024-12-13 07:03:29.504532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.299 [2024-12-13 07:03:29.504605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.299 [2024-12-13 07:03:29.504620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.299 #18 NEW cov: 11768 ft: 12969 corp: 5/46b lim: 35 exec/s: 0 rss: 67Mb L: 19/19 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:11.558 [2024-12-13 07:03:29.544494] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.558 [2024-12-13 07:03:29.544519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.558 #19 NEW cov: 11768 ft: 13086 corp: 6/53b lim: 35 exec/s: 0 rss: 67Mb L: 7/19 MS: 1 CrossOver- 00:08:11.558 [2024-12-13 07:03:29.584595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.558 [2024-12-13 07:03:29.584619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.558 #20 NEW cov: 11768 ft: 13141 corp: 7/63b lim: 35 exec/s: 0 rss: 67Mb L: 10/19 MS: 1 
EraseBytes- 00:08:11.558 [2024-12-13 07:03:29.624716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.558 [2024-12-13 07:03:29.624741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.558 #22 NEW cov: 11768 ft: 13223 corp: 8/71b lim: 35 exec/s: 0 rss: 67Mb L: 8/19 MS: 2 EraseBytes-CrossOver- 00:08:11.558 [2024-12-13 07:03:29.664990] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.558 [2024-12-13 07:03:29.665015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.558 [2024-12-13 07:03:29.665089] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.558 [2024-12-13 07:03:29.665103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.558 #23 NEW cov: 11768 ft: 13233 corp: 9/90b lim: 35 exec/s: 0 rss: 67Mb L: 19/19 MS: 1 CrossOver- 00:08:11.558 [2024-12-13 07:03:29.704971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.558 [2024-12-13 07:03:29.704996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.559 #24 NEW cov: 11768 ft: 13274 corp: 10/98b lim: 35 exec/s: 0 rss: 67Mb L: 8/19 MS: 1 CrossOver- 00:08:11.559 [2024-12-13 07:03:29.745095] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.559 [2024-12-13 07:03:29.745120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.559 #25 NEW cov: 11768 ft: 13374 corp: 11/109b lim: 35 exec/s: 0 rss: 67Mb L: 11/19 MS: 1 InsertByte- 00:08:11.559 [2024-12-13 07:03:29.785218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.559 [2024-12-13 07:03:29.785243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 #26 NEW cov: 11768 ft: 13410 corp: 12/117b lim: 35 exec/s: 0 rss: 68Mb L: 8/19 MS: 1 ChangeBinInt- 00:08:11.818 [2024-12-13 07:03:29.825341] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.825367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 #27 NEW cov: 11768 ft: 13430 corp: 13/128b lim: 35 exec/s: 0 rss: 68Mb L: 11/19 MS: 1 CopyPart- 00:08:11.818 [2024-12-13 07:03:29.865468] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.865493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 #28 NEW cov: 11768 ft: 13454 corp: 14/139b lim: 35 exec/s: 0 rss: 68Mb L: 11/19 MS: 1 
InsertByte- 00:08:11.818 [2024-12-13 07:03:29.905936] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.905961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 [2024-12-13 07:03:29.906020] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.906035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.818 [2024-12-13 07:03:29.906092] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.906106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.818 [2024-12-13 07:03:29.906164] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.906177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.818 #29 NEW cov: 11768 ft: 14042 corp: 15/170b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 InsertRepeatedBytes- 00:08:11.818 [2024-12-13 07:03:29.945769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.945795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 [2024-12-13 07:03:29.945855] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.945874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.818 #35 NEW cov: 11768 ft: 14096 corp: 16/189b lim: 35 exec/s: 0 rss: 68Mb L: 19/31 MS: 1 ChangeByte- 00:08:11.818 [2024-12-13 07:03:29.985923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.985948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 [2024-12-13 07:03:29.986007] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:29.986021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.818 #41 NEW cov: 11768 ft: 14158 corp: 17/208b lim: 35 exec/s: 0 rss: 68Mb L: 19/31 MS: 1 CMP- DE: "\000\000\000\037"- 00:08:11.818 [2024-12-13 07:03:30.025910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-12-13 07:03:30.025936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 NEW_FUNC[1/1]: 0x1967b18 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.818 #42 NEW cov: 11791 ft: 14248 corp: 18/215b lim: 35 exec/s: 0 rss: 68Mb L: 7/31 MS: 1 ChangeBinInt- 00:08:12.077 [2024-12-13 07:03:30.066054] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.066081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.077 #43 NEW cov: 11791 ft: 14265 corp: 19/224b lim: 35 exec/s: 0 rss: 68Mb L: 9/31 MS: 1 InsertByte- 00:08:12.077 [2024-12-13 07:03:30.106589] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.106616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.077 [2024-12-13 07:03:30.106689] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.106703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.077 [2024-12-13 07:03:30.106763] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.106777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.077 [2024-12-13 07:03:30.106835] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.106848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.077 #44 NEW cov: 11791 ft: 14299 corp: 20/258b lim: 35 exec/s: 44 rss: 68Mb L: 34/34 MS: 1 CrossOver- 00:08:12.077 [2024-12-13 07:03:30.156343] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.156368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.077 #45 NEW cov: 11791 ft: 14309 corp: 21/270b lim: 35 exec/s: 45 rss: 68Mb L: 12/34 MS: 1 InsertByte- 00:08:12.077 [2024-12-13 07:03:30.197041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.197066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.077 [2024-12-13 07:03:30.197135] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.197150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.077 [2024-12-13 07:03:30.197204] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.197218] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.077 [2024-12-13 07:03:30.197277] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.197291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.077 [2024-12-13 07:03:30.197348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.197361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.077 #46 NEW cov: 11791 ft: 14370 corp: 22/305b lim: 35 exec/s: 46 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:12.077 [2024-12-13 07:03:30.236537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.236562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.077 #47 NEW cov: 11791 ft: 14387 corp: 23/315b lim: 35 exec/s: 47 rss: 68Mb L: 10/35 MS: 1 ChangeBinInt- 00:08:12.077 [2024-12-13 07:03:30.276936] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000006c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.276962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.077 [2024-12-13 07:03:30.277024] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.277038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.077 [2024-12-13 07:03:30.277099] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006c6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.277112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.077 #48 NEW cov: 11791 ft: 14551 corp: 24/340b lim: 35 exec/s: 48 rss: 68Mb L: 25/35 MS: 1 InsertRepeatedBytes- 00:08:12.077 [2024-12-13 07:03:30.316929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.077 [2024-12-13 07:03:30.316954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.336 [2024-12-13 07:03:30.317014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.317028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.337 #49 NEW cov: 11791 ft: 14583 corp: 25/359b lim: 35 exec/s: 49 rss: 68Mb L: 19/35 MS: 1 ChangeBit- 00:08:12.337 [2024-12-13 07:03:30.357345] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 
[2024-12-13 07:03:30.357371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.337 [2024-12-13 07:03:30.357432] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.357450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.337 [2024-12-13 07:03:30.357509] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.357523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.337 [2024-12-13 07:03:30.357582] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.357595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.337 #50 NEW cov: 11791 ft: 14642 corp: 26/391b lim: 35 exec/s: 50 rss: 68Mb L: 32/35 MS: 1 CopyPart- 00:08:12.337 [2024-12-13 07:03:30.397064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.397089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.337 #51 NEW cov: 11791 ft: 14650 corp: 27/402b lim: 35 exec/s: 51 rss: 68Mb L: 11/35 MS: 1 ChangeByte- 00:08:12.337 [2024-12-13 07:03:30.437240] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.437266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.337 #52 NEW cov: 11791 ft: 14671 corp: 28/413b lim: 35 exec/s: 52 rss: 68Mb L: 11/35 MS: 1 CrossOver- 00:08:12.337 [2024-12-13 07:03:30.477697] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.477722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.337 [2024-12-13 07:03:30.477783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.477797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.337 [2024-12-13 07:03:30.477860] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.477873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.337 [2024-12-13 07:03:30.477935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.477949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.337 #53 NEW cov: 11791 ft: 14720 corp: 29/447b lim: 35 exec/s: 53 rss: 68Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:08:12.337 [2024-12-13 07:03:30.517430] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.517454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.337 #56 NEW cov: 11791 ft: 14741 corp: 30/458b lim: 35 exec/s: 56 rss: 68Mb L: 11/35 MS: 3 CrossOver-ChangeByte-CrossOver- 00:08:12.337 [2024-12-13 07:03:30.547825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.547849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.337 [2024-12-13 07:03:30.547927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.547945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.337 [2024-12-13 07:03:30.548007] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.337 [2024-12-13 07:03:30.548020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.337 #57 NEW cov: 11791 ft: 14783 corp: 31/483b lim: 35 exec/s: 57 rss: 68Mb L: 25/35 MS: 1 InsertRepeatedBytes- 00:08:12.596 [2024-12-13 07:03:30.587595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.596 [2024-12-13 07:03:30.587620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.596 #58 NEW cov: 11791 ft: 14794 corp: 32/494b lim: 35 exec/s: 58 rss: 69Mb L: 11/35 MS: 1 InsertByte- 00:08:12.596 [2024-12-13 07:03:30.617673] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.596 [2024-12-13 07:03:30.617697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.596 #59 NEW cov: 11791 ft: 14795 corp: 33/507b lim: 35 exec/s: 59 rss: 69Mb L: 13/35 MS: 1 CMP- DE: "\001\000\000\037"- 00:08:12.596 [2024-12-13 07:03:30.647807] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.596 [2024-12-13 07:03:30.647831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.596 #60 NEW cov: 11791 ft: 14845 corp: 34/519b lim: 35 exec/s: 60 rss: 69Mb L: 12/35 MS: 1 InsertByte- 00:08:12.597 [2024-12-13 07:03:30.688063] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.597 [2024-12-13 07:03:30.688087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:12.597 [2024-12-13 07:03:30.688146] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000077a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.597 [2024-12-13 07:03:30.688160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.597 #61 NEW cov: 11791 ft: 14853 corp: 35/535b lim: 35 exec/s: 61 rss: 69Mb L: 16/35 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:12.597 [2024-12-13 07:03:30.728039] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.597 [2024-12-13 07:03:30.728064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.597 #62 NEW cov: 11791 ft: 14867 corp: 36/546b lim: 35 exec/s: 62 rss: 69Mb L: 11/35 MS: 1 ChangeBit- 00:08:12.597 [2024-12-13 07:03:30.768127] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000071a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.597 [2024-12-13 07:03:30.768151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.597 #63 NEW cov: 11791 ft: 14883 corp: 37/556b lim: 35 exec/s: 63 rss: 69Mb L: 10/35 MS: 1 ChangeByte- 00:08:12.597 [2024-12-13 07:03:30.798376] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.597 [2024-12-13 07:03:30.798402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.597 [2024-12-13 07:03:30.798463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000071f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.597 [2024-12-13 07:03:30.798476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.597 #64 NEW cov: 11791 ft: 14888 corp: 38/570b lim: 35 exec/s: 64 rss: 69Mb L: 14/35 MS: 1 PersAutoDict- DE: "\000\000\000\037"- 00:08:12.856 [2024-12-13 07:03:30.838354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST MEM BUFFER cid:4 cdw10:0000070d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:30.838380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.856 #65 NEW cov: 11791 ft: 14976 corp: 39/580b lim: 35 exec/s: 65 rss: 69Mb L: 10/35 MS: 1 InsertByte- 00:08:12.856 [2024-12-13 07:03:30.878452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000023 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:30.878477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.856 #66 NEW cov: 11791 ft: 15024 corp: 40/591b lim: 35 exec/s: 66 rss: 69Mb L: 11/35 MS: 1 ChangeBit- 00:08:12.856 [2024-12-13 07:03:30.919176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:30.919205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.856 [2024-12-13 
07:03:30.919268] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:30.919283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.856 [2024-12-13 07:03:30.919347] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:30.919360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.856 [2024-12-13 07:03:30.919423] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:30.919437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.856 [2024-12-13 07:03:30.919497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:30.919512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.856 #67 NEW cov: 11791 ft: 15112 corp: 41/626b lim: 35 exec/s: 67 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:12.856 [2024-12-13 07:03:30.968870] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:30.968895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.856 [2024-12-13 07:03:30.968976] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:30.968991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.856 #68 NEW cov: 11791 ft: 15170 corp: 42/645b lim: 35 exec/s: 68 rss: 69Mb L: 19/35 MS: 1 ChangeBit- 00:08:12.856 [2024-12-13 07:03:31.008797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:31.008821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.856 #69 NEW cov: 11791 ft: 15179 corp: 43/656b lim: 35 exec/s: 69 rss: 69Mb L: 11/35 MS: 1 ChangeBinInt- 00:08:12.856 [2024-12-13 07:03:31.049510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:31.049537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.856 [2024-12-13 07:03:31.049616] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.856 [2024-12-13 07:03:31.049630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.856 [2024-12-13 07:03:31.049689] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES 
RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:12.856 [2024-12-13 07:03:31.049703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:12.856 [2024-12-13 07:03:31.049764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:12.856 [2024-12-13 07:03:31.049777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:12.856 [2024-12-13 07:03:31.049839] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:12.856 [2024-12-13 07:03:31.049852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0
00:08:12.856 #70 NEW cov: 11791 ft: 15189 corp: 44/691b lim: 35 exec/s: 70 rss: 69Mb L: 35/35 MS: 1 ChangeByte-
00:08:13.116 [2024-12-13 07:03:31.099562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000002e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:13.116 [2024-12-13 07:03:31.099587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:13.116 [2024-12-13 07:03:31.099648] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:13.116 [2024-12-13 07:03:31.099662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:13.116 [2024-12-13 07:03:31.099722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:13.116 [2024-12-13 07:03:31.099735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:13.116 [2024-12-13 07:03:31.099793] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:08:13.116 [2024-12-13 07:03:31.099807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:08:13.116 #71 NEW cov: 11791 ft: 15199 corp: 45/723b lim: 35 exec/s: 35 rss: 69Mb L: 32/35 MS: 1 InsertRepeatedBytes-
00:08:13.116 #71 DONE cov: 11791 ft: 15199 corp: 45/723b lim: 35 exec/s: 35 rss: 69Mb
00:08:13.116 ###### Recommended dictionary. ######
00:08:13.116 "\377\377\377\377\377\377\377\377" # Uses: 1
00:08:13.116 "\000\000\000\037" # Uses: 1
00:08:13.116 "\001\000\000\037" # Uses: 0
00:08:13.116 ###### End of recommended dictionary. ######
00:08:13.116 Done 71 runs in 2 second(s)
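The xtrace that follows shows how nvmf/run.sh prepares the next fuzzer instance (start_llvm_fuzz 16): the fuzzer index is zero-padded and appended to "44" to give that instance its own NVMe/TCP port (4416), a per-instance corpus directory and JSON config are set up, the template config's trsvcid is rewritten with sed, and the instrumented llvm_nvme_fuzz binary is launched against the resulting in-process target for -t seconds. A minimal sketch of that step, reconstructed only from the trace; the function signature, the $rootdir variable, and the redirection of sed's output into $nvmf_cfg are assumptions (bash xtrace does not print redirections):

    # Sketch of start_llvm_fuzz as implied by the nvmf/run.sh xtrace below.
    # Assumptions: the signature, $rootdir, and the "> $nvmf_cfg" redirection.
    start_llvm_fuzz() {
        local fuzzer_type=$1   # e.g. 16
        local timen=$2         # seconds per fuzzer (-t)
        local core=$3          # reactor core mask (-m)
        local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
        local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
        # Each index maps to a unique listener port: "44" + two-digit index.
        local port="44$(printf %02d "$fuzzer_type")"
        mkdir -p "$corpus_dir"
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
        # Point the target's listener at this instance's port.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
        # Run the fuzzer: -F selects the transport ID to attack, -D the
        # corpus directory, -Z the fuzzer number, -r the SPDK RPC socket.
        "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
            -m "$core" -s 512 -P "$rootdir/../output/llvm/" \
            -F "$trid" -c "$nvmf_cfg" -t "$timen" \
            -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"
    }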
00:08:13.116 07:03:31 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf
00:08:13.116 07:03:31 -- ../common.sh@72 -- # (( i++ ))
00:08:13.116 07:03:31 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:13.116 07:03:31 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1
00:08:13.116 07:03:31 -- nvmf/run.sh@23 -- # local fuzzer_type=16
00:08:13.116 07:03:31 -- nvmf/run.sh@24 -- # local timen=1
00:08:13.116 07:03:31 -- nvmf/run.sh@25 -- # local core=0x1
00:08:13.116 07:03:31 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:13.116 07:03:31 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf
00:08:13.116 07:03:31 -- nvmf/run.sh@29 -- # printf %02d 16
00:08:13.116 07:03:31 -- nvmf/run.sh@29 -- # port=4416
00:08:13.116 07:03:31 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:13.116 07:03:31 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'
00:08:13.116 07:03:31 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:13.116 07:03:31 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock
00:08:13.117 [2024-12-13 07:03:31.275626] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:13.117 [2024-12-13 07:03:31.275692] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid496835 ]
00:08:13.117 EAL: No free 2048 kB hugepages reported on node 1
00:08:13.117 [2024-12-13 07:03:31.453620] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:13.117 [2024-12-13 07:03:31.474866] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:13.117 [2024-12-13 07:03:31.475007] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:13.117 [2024-12-13 07:03:31.526628] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:13.117 [2024-12-13 07:03:31.542876] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 ***
00:08:13.117 INFO: Running with entropic power schedule (0xFF, 100).
00:08:13.117 INFO: Seed: 1571992783
00:08:13.117 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:13.117 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:13.117 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:13.117 INFO: A corpus is not provided, starting from an empty corpus
00:08:13.117 #2 INITED exec/s: 0 rss: 60Mb
00:08:13.117 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:13.376 This may also happen if the target rejected all inputs we tried so far 00:08:13.376 [2024-12-13 07:03:31.608938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046443974072729 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.376 [2024-12-13 07:03:31.608983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.376 [2024-12-13 07:03:31.609113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.376 [2024-12-13 07:03:31.609138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.376 [2024-12-13 07:03:31.609268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11068046444225730969 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.376 [2024-12-13 07:03:31.609290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.894 NEW_FUNC[1/671]: 0x467ec8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:13.894 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.894 #5 NEW cov: 11667 ft: 11668 corp: 2/66b lim: 105 exec/s: 0 rss: 67Mb L: 65/65 MS: 3 ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:13.894 [2024-12-13 07:03:31.939479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582422015 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:31.939519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.894 #9 NEW cov: 11780 ft: 12690 corp: 3/103b lim: 105 exec/s: 0 rss: 67Mb L: 37/65 MS: 4 CrossOver-InsertRepeatedBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:13.894 [2024-12-13 07:03:31.980021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:31.980054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.894 [2024-12-13 07:03:31.980150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:31.980172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.894 [2024-12-13 07:03:31.980289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:31.980312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.894 [2024-12-13 07:03:31.980430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 
[2024-12-13 07:03:31.980452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.894 #10 NEW cov: 11786 ft: 13557 corp: 4/207b lim: 105 exec/s: 0 rss: 67Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:08:13.894 [2024-12-13 07:03:32.020179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:32.020217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.894 [2024-12-13 07:03:32.020343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:32.020366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.894 [2024-12-13 07:03:32.020487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:32.020508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.894 [2024-12-13 07:03:32.020632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:32.020651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.894 #11 NEW cov: 11871 ft: 13776 corp: 5/311b lim: 105 exec/s: 0 rss: 67Mb L: 104/104 MS: 1 ChangeBinInt- 00:08:13.894 [2024-12-13 07:03:32.070301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:32.070334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.894 [2024-12-13 07:03:32.070440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:32.070461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.894 [2024-12-13 07:03:32.070574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.894 [2024-12-13 07:03:32.070595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.895 [2024-12-13 07:03:32.070715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.895 [2024-12-13 07:03:32.070733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.895 #12 NEW cov: 11871 ft: 13840 corp: 6/415b lim: 105 exec/s: 0 rss: 67Mb L: 104/104 MS: 1 ChangeByte- 00:08:13.895 [2024-12-13 
07:03:32.109845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046441825606041 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.895 [2024-12-13 07:03:32.109874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.895 #13 NEW cov: 11871 ft: 14024 corp: 7/453b lim: 105 exec/s: 0 rss: 67Mb L: 38/104 MS: 1 CrossOver- 00:08:14.155 [2024-12-13 07:03:32.150706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.150738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.150820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.150843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.150956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.150976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.151088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.151112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.151239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.151261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.155 #14 NEW cov: 11871 ft: 14135 corp: 8/558b lim: 105 exec/s: 0 rss: 67Mb L: 105/105 MS: 1 CopyPart- 00:08:14.155 [2024-12-13 07:03:32.190821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.190851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.190945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.190979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.191100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.191120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.191242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.191262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.191377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.191399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.155 #15 NEW cov: 11871 ft: 14184 corp: 9/663b lim: 105 exec/s: 0 rss: 67Mb L: 105/105 MS: 1 ShuffleBytes- 00:08:14.155 [2024-12-13 07:03:32.230787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046441825606041 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.230816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.230916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:44719 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.230939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.231053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.231075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.155 [2024-12-13 07:03:32.231200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:12587190073825341102 len:44719 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.231233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.155 #16 NEW cov: 11871 ft: 14217 corp: 10/748b lim: 105 exec/s: 0 rss: 68Mb L: 85/105 MS: 1 InsertRepeatedBytes- 00:08:14.155 [2024-12-13 07:03:32.270407] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18374687575056121855 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.270435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.155 #17 NEW cov: 11871 ft: 14246 corp: 11/785b lim: 105 exec/s: 0 rss: 68Mb L: 37/105 MS: 1 ChangeBinInt- 00:08:14.155 [2024-12-13 07:03:32.310509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:72057594037927680 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.310538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.155 #19 NEW cov: 11871 ft: 14332 corp: 12/820b lim: 105 exec/s: 0 rss: 68Mb L: 35/105 MS: 2 ShuffleBytes-CrossOver- 00:08:14.155 [2024-12-13 07:03:32.350664] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:72057594037927680 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.350690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.155 #20 NEW cov: 11871 ft: 14360 corp: 13/857b lim: 105 exec/s: 0 rss: 68Mb L: 37/105 MS: 1 CMP- DE: " \000"- 00:08:14.155 [2024-12-13 07:03:32.390783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18374687575056121855 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.155 [2024-12-13 07:03:32.390813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.415 #21 NEW cov: 11871 ft: 14372 corp: 14/894b lim: 105 exec/s: 0 rss: 68Mb L: 37/105 MS: 1 ChangeByte- 00:08:14.415 [2024-12-13 07:03:32.430928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046441825606041 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.430955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.415 #22 NEW cov: 11871 ft: 14390 corp: 15/930b lim: 105 exec/s: 0 rss: 68Mb L: 36/105 MS: 1 EraseBytes- 00:08:14.415 [2024-12-13 07:03:32.470962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2709365533425965465 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.470992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.415 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.415 #23 NEW cov: 11894 ft: 14444 corp: 16/968b lim: 105 exec/s: 0 rss: 68Mb L: 38/105 MS: 1 ChangeByte- 00:08:14.415 [2024-12-13 07:03:32.511657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046443974072729 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.511688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.415 [2024-12-13 07:03:32.511783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:13879 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.511802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.415 [2024-12-13 07:03:32.511929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3906369758457902646 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.511951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.415 [2024-12-13 07:03:32.512069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:11068046444225730969 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.512090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.415 #24 NEW cov: 
11894 ft: 14525 corp: 17/1053b lim: 105 exec/s: 0 rss: 68Mb L: 85/105 MS: 1 InsertRepeatedBytes- 00:08:14.415 [2024-12-13 07:03:32.561255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:9008298766368512 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.561285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.415 #25 NEW cov: 11894 ft: 14562 corp: 18/1090b lim: 105 exec/s: 25 rss: 68Mb L: 37/105 MS: 1 PersAutoDict- DE: " \000"- 00:08:14.415 [2024-12-13 07:03:32.602136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.602166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.415 [2024-12-13 07:03:32.602272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.602293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.415 [2024-12-13 07:03:32.602401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.602424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.415 [2024-12-13 07:03:32.602532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.602553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.415 [2024-12-13 07:03:32.602672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.602692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.415 #26 NEW cov: 11894 ft: 14567 corp: 19/1195b lim: 105 exec/s: 26 rss: 68Mb L: 105/105 MS: 1 ShuffleBytes- 00:08:14.415 [2024-12-13 07:03:32.642041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046443974072729 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.642069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.415 [2024-12-13 07:03:32.642153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:13879 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 [2024-12-13 07:03:32.642173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.415 [2024-12-13 07:03:32.642287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3906369334917084726 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.415 
[2024-12-13 07:03:32.642311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.416 [2024-12-13 07:03:32.642428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:11068046444225730969 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.416 [2024-12-13 07:03:32.642451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.676 #27 NEW cov: 11894 ft: 14585 corp: 20/1281b lim: 105 exec/s: 27 rss: 68Mb L: 86/105 MS: 1 CopyPart- 00:08:14.676 [2024-12-13 07:03:32.682418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.682452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.682539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.682560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.682666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.682686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.682798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.682817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.682922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.682941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.676 #28 NEW cov: 11894 ft: 14636 corp: 21/1386b lim: 105 exec/s: 28 rss: 68Mb L: 105/105 MS: 1 ChangeBinInt- 00:08:14.676 [2024-12-13 07:03:32.732392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.732423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.732510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.732533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.732647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.732668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.732781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.732803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.676 #29 NEW cov: 11894 ft: 14673 corp: 22/1490b lim: 105 exec/s: 29 rss: 69Mb L: 104/105 MS: 1 CrossOver- 00:08:14.676 [2024-12-13 07:03:32.772516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046441825606041 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.772546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.772645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.772666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.772780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.772805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.772926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:6293595036912670551 len:22360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.772944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.676 #30 NEW cov: 11894 ft: 14684 corp: 23/1594b lim: 105 exec/s: 30 rss: 69Mb L: 104/105 MS: 1 InsertRepeatedBytes- 00:08:14.676 [2024-12-13 07:03:32.812033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046441825606041 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.812063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.676 #31 NEW cov: 11894 ft: 14694 corp: 24/1633b lim: 105 exec/s: 31 rss: 69Mb L: 39/105 MS: 1 InsertByte- 00:08:14.676 [2024-12-13 07:03:32.852480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046441825606041 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.852509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.852593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.852614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.852730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11068046444225730969 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.852755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.676 #32 NEW cov: 11894 ft: 14711 corp: 25/1700b lim: 105 exec/s: 32 rss: 69Mb L: 67/105 MS: 1 CopyPart- 00:08:14.676 [2024-12-13 07:03:32.893114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584584703 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.893143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.893228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.893253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.893370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.893388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.893503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.893524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.676 [2024-12-13 07:03:32.893642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.676 [2024-12-13 07:03:32.893665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.936 #33 NEW cov: 11894 ft: 14728 corp: 26/1805b lim: 105 exec/s: 33 rss: 69Mb L: 105/105 MS: 1 InsertByte- 00:08:14.936 [2024-12-13 07:03:32.942890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046441825606041 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.936 [2024-12-13 07:03:32.942923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.936 [2024-12-13 07:03:32.942997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:44719 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.936 [2024-12-13 07:03:32.943021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.936 [2024-12-13 07:03:32.943132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12587190073825341102 len:44719 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.936 [2024-12-13 07:03:32.943154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.936 [2024-12-13 07:03:32.943274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:12587190073825341102 len:44719 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.936 [2024-12-13 07:03:32.943295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.936 #34 NEW cov: 11894 ft: 14741 corp: 27/1890b lim: 105 exec/s: 34 rss: 69Mb L: 85/105 MS: 1 ChangeBit- 00:08:14.936 [2024-12-13 07:03:32.992778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046443974072729 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.936 [2024-12-13 07:03:32.992809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:32.992915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:32.992942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.937 #35 NEW cov: 11894 ft: 15012 corp: 28/1948b lim: 105 exec/s: 35 rss: 69Mb L: 58/105 MS: 1 EraseBytes- 00:08:14.937 [2024-12-13 07:03:33.032674] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046441825606041 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.032706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.937 #36 NEW cov: 11894 ft: 15036 corp: 29/1986b lim: 105 exec/s: 36 rss: 69Mb L: 38/105 MS: 1 ShuffleBytes- 00:08:14.937 [2024-12-13 07:03:33.073535] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.073568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:33.073662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.073681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:33.073795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.073815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:33.073929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.073949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:33.074069] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 
lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.074092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.937 #37 NEW cov: 11894 ft: 15042 corp: 30/2091b lim: 105 exec/s: 37 rss: 69Mb L: 105/105 MS: 1 ShuffleBytes- 00:08:14.937 [2024-12-13 07:03:33.113191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046443974072729 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.113221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:33.113315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11428334414415370649 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.113338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:33.113467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:11068046444225730969 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.113490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.937 #38 NEW cov: 11894 ft: 15054 corp: 31/2156b lim: 105 exec/s: 38 rss: 69Mb L: 65/105 MS: 1 ChangeBinInt- 00:08:14.937 [2024-12-13 07:03:33.153572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046443974072729 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.153606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:33.153705] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:13879 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.153729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:33.153842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3906369334917084726 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.153866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.937 [2024-12-13 07:03:33.153990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:11068046444225699840 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.937 [2024-12-13 07:03:33.154012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.196 #39 NEW cov: 11894 ft: 15069 corp: 32/2242b lim: 105 exec/s: 39 rss: 69Mb L: 86/105 MS: 1 PersAutoDict- DE: " \000"- 00:08:15.196 [2024-12-13 07:03:33.203235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582422015 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.196 [2024-12-13 07:03:33.203261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.196 #40 NEW cov: 11894 ft: 15082 corp: 33/2279b lim: 105 exec/s: 40 rss: 69Mb L: 37/105 MS: 1 PersAutoDict- DE: " \000"- 00:08:15.196 [2024-12-13 07:03:33.243258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:72057594037927680 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.196 [2024-12-13 07:03:33.243293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.196 #41 NEW cov: 11894 ft: 15087 corp: 34/2314b lim: 105 exec/s: 41 rss: 69Mb L: 35/105 MS: 1 PersAutoDict- DE: " \000"- 00:08:15.196 [2024-12-13 07:03:33.283358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:72057594037927680 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.196 [2024-12-13 07:03:33.283386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.196 #42 NEW cov: 11894 ft: 15099 corp: 35/2349b lim: 105 exec/s: 42 rss: 69Mb L: 35/105 MS: 1 ChangeBinInt- 00:08:15.196 [2024-12-13 07:03:33.323483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069582487551 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.196 [2024-12-13 07:03:33.323515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.196 #43 NEW cov: 11894 ft: 15117 corp: 36/2386b lim: 105 exec/s: 43 rss: 69Mb L: 37/105 MS: 1 ChangeBit- 00:08:15.196 [2024-12-13 07:03:33.364163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11068046443974072729 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.196 [2024-12-13 07:03:33.364193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.197 [2024-12-13 07:03:33.364283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11068046444225730969 len:13879 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.197 [2024-12-13 07:03:33.364305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.197 [2024-12-13 07:03:33.364423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3906369333256140441 len:13978 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.197 [2024-12-13 07:03:33.364443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.197 [2024-12-13 07:03:33.364565] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:11033819746764691865 len:39322 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.197 [2024-12-13 07:03:33.364586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.197 #44 NEW cov: 11894 ft: 15120 corp: 37/2475b lim: 105 exec/s: 44 rss: 69Mb L: 89/105 MS: 1 InsertRepeatedBytes- 00:08:15.197 [2024-12-13 07:03:33.403782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446735273489465343 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.197 [2024-12-13 07:03:33.403811] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.197 #45 NEW cov: 11894 ft: 15130 corp: 38/2512b lim: 105 exec/s: 45 rss: 69Mb L: 37/105 MS: 1 ChangeBit- 00:08:15.456 [2024-12-13 07:03:33.444427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.456 [2024-12-13 07:03:33.444457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.456 [2024-12-13 07:03:33.444537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.456 [2024-12-13 07:03:33.444564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.456 [2024-12-13 07:03:33.444683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.456 [2024-12-13 07:03:33.444703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.456 [2024-12-13 07:03:33.444819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.456 [2024-12-13 07:03:33.444841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.456 #46 NEW cov: 11894 ft: 15139 corp: 39/2613b lim: 105 exec/s: 46 rss: 69Mb L: 101/105 MS: 1 EraseBytes- 00:08:15.456 [2024-12-13 07:03:33.483912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:72057594037927680 len:256 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.456 [2024-12-13 07:03:33.483936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.456 #47 NEW cov: 11894 ft: 15145 corp: 40/2648b lim: 105 exec/s: 47 rss: 69Mb L: 35/105 MS: 1 PersAutoDict- DE: " \000"- 00:08:15.456 [2024-12-13 07:03:33.524792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.456 [2024-12-13 07:03:33.524822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.456 [2024-12-13 07:03:33.524910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.457 [2024-12-13 07:03:33.524933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.457 [2024-12-13 07:03:33.525056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.457 [2024-12-13 07:03:33.525078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.457 [2024-12-13 07:03:33.525193] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.457 [2024-12-13 07:03:33.525216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.457 [2024-12-13 07:03:33.525348] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.457 [2024-12-13 07:03:33.525368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.457 #48 NEW cov: 11894 ft: 15189 corp: 41/2753b lim: 105 exec/s: 48 rss: 69Mb L: 105/105 MS: 1 ChangeBit- 00:08:15.457 [2024-12-13 07:03:33.564215] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446735273489465343 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.457 [2024-12-13 07:03:33.564242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.457 #49 NEW cov: 11894 ft: 15198 corp: 42/2790b lim: 105 exec/s: 24 rss: 70Mb L: 37/105 MS: 1 ShuffleBytes- 00:08:15.457 #49 DONE cov: 11894 ft: 15198 corp: 42/2790b lim: 105 exec/s: 24 rss: 70Mb 00:08:15.457 ###### Recommended dictionary. ###### 00:08:15.457 " \000" # Uses: 5 00:08:15.457 ###### End of recommended dictionary. ###### 00:08:15.457 Done 49 runs in 2 second(s) 00:08:15.716 07:03:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:15.716 07:03:33 -- ../common.sh@72 -- # (( i++ )) 00:08:15.716 07:03:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.716 07:03:33 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:15.716 07:03:33 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:15.716 07:03:33 -- nvmf/run.sh@24 -- # local timen=1 00:08:15.716 07:03:33 -- nvmf/run.sh@25 -- # local core=0x1 00:08:15.716 07:03:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:15.716 07:03:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:15.716 07:03:33 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:15.716 07:03:33 -- nvmf/run.sh@29 -- # port=4417 00:08:15.716 07:03:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:15.716 07:03:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:15.716 07:03:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.716 07:03:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:15.716 [2024-12-13 07:03:33.746719] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:15.716 [2024-12-13 07:03:33.746811] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid497372 ] 00:08:15.716 EAL: No free 2048 kB hugepages reported on node 1 00:08:15.716 [2024-12-13 07:03:33.923243] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.716 [2024-12-13 07:03:33.942806] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:15.717 [2024-12-13 07:03:33.942938] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.976 [2024-12-13 07:03:33.994367] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.976 [2024-12-13 07:03:34.010617] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:15.976 INFO: Running with entropic power schedule (0xFF, 100). 00:08:15.976 INFO: Seed: 4038983554 00:08:15.976 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:15.976 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:15.976 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:15.976 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.976 #2 INITED exec/s: 0 rss: 59Mb 00:08:15.976 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:15.976 This may also happen if the target rejected all inputs we tried so far 00:08:15.976 [2024-12-13 07:03:34.055954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.976 [2024-12-13 07:03:34.055984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.976 [2024-12-13 07:03:34.056015] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.976 [2024-12-13 07:03:34.056031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.976 [2024-12-13 07:03:34.056081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.976 [2024-12-13 07:03:34.056096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.236 NEW_FUNC[1/672]: 0x46b1b8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:16.236 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.236 #7 NEW cov: 11688 ft: 11689 corp: 2/80b lim: 120 exec/s: 0 rss: 67Mb L: 79/79 MS: 5 InsertByte-InsertRepeatedBytes-EraseBytes-CopyPart-InsertRepeatedBytes- 00:08:16.236 [2024-12-13 07:03:34.376998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.236 [2024-12-13 07:03:34.377057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.236 [2024-12-13 07:03:34.377138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.236 [2024-12-13 07:03:34.377167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.236 [2024-12-13 07:03:34.377257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18400582177529004031 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.236 [2024-12-13 07:03:34.377286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.236 #8 NEW cov: 11801 ft: 12319 corp: 3/159b lim: 120 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 ChangeByte- 00:08:16.236 [2024-12-13 07:03:34.426929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.236 [2024-12-13 07:03:34.426959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.236 [2024-12-13 07:03:34.426994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.236 [2024-12-13 07:03:34.427011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.236 [2024-12-13 07:03:34.427064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.236 [2024-12-13 07:03:34.427080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.236 #9 NEW cov: 11807 ft: 12633 corp: 4/238b lim: 120 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 ShuffleBytes- 00:08:16.236 [2024-12-13 07:03:34.467061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.236 [2024-12-13 07:03:34.467093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.236 [2024-12-13 07:03:34.467143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.236 [2024-12-13 07:03:34.467159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.236 [2024-12-13 07:03:34.467218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.236 [2024-12-13 07:03:34.467233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.510 #10 NEW cov: 11892 ft: 12890 corp: 5/317b lim: 120 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 ChangeByte- 00:08:16.510 [2024-12-13 07:03:34.507178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 
len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.510 [2024-12-13 07:03:34.507211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.510 [2024-12-13 07:03:34.507256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.510 [2024-12-13 07:03:34.507271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.510 [2024-12-13 07:03:34.507321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.507337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.511 #16 NEW cov: 11892 ft: 13024 corp: 6/396b lim: 120 exec/s: 0 rss: 67Mb L: 79/79 MS: 1 ChangeBinInt- 00:08:16.511 [2024-12-13 07:03:34.547427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.547455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.547501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.547517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.547566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.547581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.547632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.547648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.511 #17 NEW cov: 11892 ft: 13506 corp: 7/492b lim: 120 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 CopyPart- 00:08:16.511 [2024-12-13 07:03:34.587527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.587555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.587597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.587612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.587665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 
cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.587679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.587729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744071435745400 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.587744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.511 #18 NEW cov: 11892 ft: 13557 corp: 8/606b lim: 120 exec/s: 0 rss: 67Mb L: 114/114 MS: 1 InsertRepeatedBytes- 00:08:16.511 [2024-12-13 07:03:34.627508] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.627536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.627572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.627587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.627640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.627671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.511 #19 NEW cov: 11892 ft: 13657 corp: 9/685b lim: 120 exec/s: 0 rss: 67Mb L: 79/114 MS: 1 CMP- DE: "\004\000\000\000\000\000\000\000"- 00:08:16.511 [2024-12-13 07:03:34.667749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.667777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.667814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.667830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.667881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.667897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.667933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.667948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.511 #20 NEW cov: 11892 
ft: 13738 corp: 10/803b lim: 120 exec/s: 0 rss: 67Mb L: 118/118 MS: 1 CrossOver- 00:08:16.511 [2024-12-13 07:03:34.707729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.707758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.707811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.707827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.707881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.707897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.511 #21 NEW cov: 11892 ft: 13770 corp: 11/895b lim: 120 exec/s: 0 rss: 67Mb L: 92/118 MS: 1 InsertRepeatedBytes- 00:08:16.511 [2024-12-13 07:03:34.748028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.748056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.748120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.748136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.748192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.748208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.511 [2024-12-13 07:03:34.748260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744071435780095 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.511 [2024-12-13 07:03:34.748275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.771 #22 NEW cov: 11892 ft: 13858 corp: 12/995b lim: 120 exec/s: 0 rss: 67Mb L: 100/118 MS: 1 CopyPart- 00:08:16.771 [2024-12-13 07:03:34.788117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.788145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.788191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.788207] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.788259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.788275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.788330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.788345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.771 #28 NEW cov: 11892 ft: 13913 corp: 13/1113b lim: 120 exec/s: 0 rss: 68Mb L: 118/118 MS: 1 CopyPart- 00:08:16.771 [2024-12-13 07:03:34.828092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.828120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.828157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.828172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.828230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.828244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.771 #29 NEW cov: 11892 ft: 13922 corp: 14/1193b lim: 120 exec/s: 0 rss: 68Mb L: 80/118 MS: 1 InsertByte- 00:08:16.771 [2024-12-13 07:03:34.868357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.868384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.868421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.868436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.868488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.868504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.868557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:100663296 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 
[2024-12-13 07:03:34.868571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.771 #30 NEW cov: 11892 ft: 13945 corp: 15/1289b lim: 120 exec/s: 0 rss: 68Mb L: 96/118 MS: 1 ChangeBinInt- 00:08:16.771 [2024-12-13 07:03:34.908484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.908512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.908561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.908578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.908645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.908659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.908712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.908727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.771 #31 NEW cov: 11892 ft: 13981 corp: 16/1407b lim: 120 exec/s: 0 rss: 68Mb L: 118/118 MS: 1 CopyPart- 00:08:16.771 [2024-12-13 07:03:34.948601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.771 [2024-12-13 07:03:34.948633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.771 [2024-12-13 07:03:34.948671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.772 [2024-12-13 07:03:34.948686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.772 [2024-12-13 07:03:34.948739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.772 [2024-12-13 07:03:34.948754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.772 [2024-12-13 07:03:34.948805] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.772 [2024-12-13 07:03:34.948820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.772 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.772 #32 NEW cov: 11915 ft: 14064 
corp: 17/1517b lim: 120 exec/s: 0 rss: 68Mb L: 110/118 MS: 1 CrossOver- 00:08:16.772 [2024-12-13 07:03:34.998451] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.772 [2024-12-13 07:03:34.998479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.772 [2024-12-13 07:03:34.998530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.772 [2024-12-13 07:03:34.998546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.039 #33 NEW cov: 11915 ft: 14487 corp: 18/1587b lim: 120 exec/s: 0 rss: 68Mb L: 70/118 MS: 1 EraseBytes- 00:08:17.039 [2024-12-13 07:03:35.038680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.038707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.038751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.038767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.038816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.038831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.039 #34 NEW cov: 11915 ft: 14575 corp: 19/1667b lim: 120 exec/s: 34 rss: 68Mb L: 80/118 MS: 1 EraseBytes- 00:08:17.039 [2024-12-13 07:03:35.078973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070991642623 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.079001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.079044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.079060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.079112] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.079131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.079185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.079204] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.039 #35 NEW cov: 11915 ft: 14629 corp: 20/1765b lim: 120 exec/s: 35 rss: 68Mb L: 98/118 MS: 1 CrossOver- 00:08:17.039 [2024-12-13 07:03:35.118928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.118957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.118997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.119013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.119066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744069414584575 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.119081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.039 #36 NEW cov: 11915 ft: 14649 corp: 21/1845b lim: 120 exec/s: 36 rss: 68Mb L: 80/118 MS: 1 PersAutoDict- DE: "\004\000\000\000\000\000\000\000"- 00:08:17.039 [2024-12-13 07:03:35.159091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.159118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.159158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.159195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.159251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.159267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.039 #37 NEW cov: 11915 ft: 14688 corp: 22/1925b lim: 120 exec/s: 37 rss: 68Mb L: 80/118 MS: 1 ChangeBinInt- 00:08:17.039 [2024-12-13 07:03:35.199205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.199232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.199270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.199283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.199337] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.199353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.039 #38 NEW cov: 11915 ft: 14697 corp: 23/2004b lim: 120 exec/s: 38 rss: 68Mb L: 79/118 MS: 1 ChangeByte- 00:08:17.039 [2024-12-13 07:03:35.239446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.239473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.239513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069414584575 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.239526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.239576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.239592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.039 [2024-12-13 07:03:35.239642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:67108864 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.039 [2024-12-13 07:03:35.239657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.039 #39 NEW cov: 11915 ft: 14706 corp: 24/2113b lim: 120 exec/s: 39 rss: 68Mb L: 109/118 MS: 1 InsertRepeatedBytes- 00:08:17.298 [2024-12-13 07:03:35.279460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.279488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.298 [2024-12-13 07:03:35.279541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.279557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.298 [2024-12-13 07:03:35.279617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744069414584575 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.279631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.298 #40 NEW cov: 11915 ft: 14722 corp: 25/2193b lim: 120 exec/s: 40 rss: 68Mb L: 80/118 MS: 1 PersAutoDict- DE: "\004\000\000\000\000\000\000\000"- 00:08:17.298 [2024-12-13 07:03:35.319681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 
[2024-12-13 07:03:35.319710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.298 [2024-12-13 07:03:35.319767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.319783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.298 [2024-12-13 07:03:35.319834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.319850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.298 [2024-12-13 07:03:35.319891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.319909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.298 #41 NEW cov: 11915 ft: 14769 corp: 26/2302b lim: 120 exec/s: 41 rss: 68Mb L: 109/118 MS: 1 CopyPart- 00:08:17.298 [2024-12-13 07:03:35.359778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446512074022091007 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.359805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.298 [2024-12-13 07:03:35.359871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.359888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.298 [2024-12-13 07:03:35.359940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.359956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.298 [2024-12-13 07:03:35.360009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:100663296 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.360022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.298 #42 NEW cov: 11915 ft: 14783 corp: 27/2398b lim: 120 exec/s: 42 rss: 68Mb L: 96/118 MS: 1 ChangeByte- 00:08:17.298 [2024-12-13 07:03:35.399933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446512074022091007 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.399961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.298 [2024-12-13 07:03:35.400008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:17.298 [2024-12-13 07:03:35.400023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.299 [2024-12-13 07:03:35.400073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.400089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.299 [2024-12-13 07:03:35.400141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.400156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.299 #43 NEW cov: 11915 ft: 14793 corp: 28/2512b lim: 120 exec/s: 43 rss: 68Mb L: 114/118 MS: 1 CopyPart- 00:08:17.299 [2024-12-13 07:03:35.439941] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.439969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.299 [2024-12-13 07:03:35.440003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.440018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.299 [2024-12-13 07:03:35.440072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.440091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.299 #44 NEW cov: 11915 ft: 14801 corp: 29/2591b lim: 120 exec/s: 44 rss: 68Mb L: 79/118 MS: 1 PersAutoDict- DE: "\004\000\000\000\000\000\000\000"- 00:08:17.299 [2024-12-13 07:03:35.480127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.480155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.299 [2024-12-13 07:03:35.480209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.480226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.299 [2024-12-13 07:03:35.480279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.480295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.299 [2024-12-13 07:03:35.480347] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2242545357980376863 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.480363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.299 #45 NEW cov: 11915 ft: 14817 corp: 30/2692b lim: 120 exec/s: 45 rss: 68Mb L: 101/118 MS: 1 InsertRepeatedBytes- 00:08:17.299 [2024-12-13 07:03:35.520153] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.520180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.299 [2024-12-13 07:03:35.520223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:75153814486777856 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.520238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.299 [2024-12-13 07:03:35.520292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.299 [2024-12-13 07:03:35.520307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.558 #46 NEW cov: 11915 ft: 14829 corp: 31/2772b lim: 120 exec/s: 46 rss: 68Mb L: 80/118 MS: 1 ChangeBinInt- 00:08:17.558 [2024-12-13 07:03:35.560119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.560146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.558 [2024-12-13 07:03:35.560219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.560238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.558 #47 NEW cov: 11915 ft: 14852 corp: 32/2842b lim: 120 exec/s: 47 rss: 68Mb L: 70/118 MS: 1 ShuffleBytes- 00:08:17.558 [2024-12-13 07:03:35.600549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:11008 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.600576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.558 [2024-12-13 07:03:35.600617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.600632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.558 [2024-12-13 07:03:35.600685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.600701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.558 [2024-12-13 07:03:35.600753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.600769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.558 #48 NEW cov: 11915 ft: 14855 corp: 33/2961b lim: 120 exec/s: 48 rss: 68Mb L: 119/119 MS: 1 InsertByte- 00:08:17.558 [2024-12-13 07:03:35.640495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.640523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.558 [2024-12-13 07:03:35.640560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.640574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.558 [2024-12-13 07:03:35.640627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.640643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.558 #49 NEW cov: 11915 ft: 14891 corp: 34/3047b lim: 120 exec/s: 49 rss: 68Mb L: 86/119 MS: 1 EraseBytes- 00:08:17.558 [2024-12-13 07:03:35.680624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.558 [2024-12-13 07:03:35.680651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.559 [2024-12-13 07:03:35.680689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.680704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.559 [2024-12-13 07:03:35.680757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.680772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.559 #50 NEW cov: 11915 ft: 14898 corp: 35/3126b lim: 120 exec/s: 50 rss: 68Mb L: 79/119 MS: 1 CopyPart- 00:08:17.559 [2024-12-13 07:03:35.710731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.710758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.559 [2024-12-13 07:03:35.710801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 
lba:72057589742960816 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.710817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.559 [2024-12-13 07:03:35.710871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744070085672959 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.710903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.559 #51 NEW cov: 11915 ft: 14905 corp: 36/3207b lim: 120 exec/s: 51 rss: 68Mb L: 81/119 MS: 1 InsertByte- 00:08:17.559 [2024-12-13 07:03:35.750817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551577 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.750844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.559 [2024-12-13 07:03:35.750881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.750897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.559 [2024-12-13 07:03:35.750950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.750966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.559 #52 NEW cov: 11915 ft: 14941 corp: 37/3286b lim: 120 exec/s: 52 rss: 68Mb L: 79/119 MS: 1 ChangeByte- 00:08:17.559 [2024-12-13 07:03:35.790924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.790951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.559 [2024-12-13 07:03:35.790996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.791012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.559 [2024-12-13 07:03:35.791065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.559 [2024-12-13 07:03:35.791082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.818 #53 NEW cov: 11915 ft: 14953 corp: 38/3365b lim: 120 exec/s: 53 rss: 68Mb L: 79/119 MS: 1 ShuffleBytes- 00:08:17.818 [2024-12-13 07:03:35.831259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.831287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.818 [2024-12-13 07:03:35.831334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.831350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.818 [2024-12-13 07:03:35.831403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.831419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.818 [2024-12-13 07:03:35.831472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.831488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.818 #54 NEW cov: 11915 ft: 14962 corp: 39/3474b lim: 120 exec/s: 54 rss: 68Mb L: 109/119 MS: 1 ShuffleBytes- 00:08:17.818 [2024-12-13 07:03:35.871384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.871412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.818 [2024-12-13 07:03:35.871452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.871464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.818 [2024-12-13 07:03:35.871516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.871530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.818 [2024-12-13 07:03:35.871585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.871600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.818 #55 NEW cov: 11915 ft: 14966 corp: 40/3584b lim: 120 exec/s: 55 rss: 68Mb L: 110/119 MS: 1 ChangeBinInt- 00:08:17.818 [2024-12-13 07:03:35.911480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.911507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.818 [2024-12-13 07:03:35.911556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744069414584320 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.911571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.818 [2024-12-13 07:03:35.911621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.818 [2024-12-13 07:03:35.911636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.818 [2024-12-13 07:03:35.911688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744071435780095 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:35.911703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.819 #56 NEW cov: 11915 ft: 14967 corp: 41/3684b lim: 120 exec/s: 56 rss: 68Mb L: 100/119 MS: 1 ChangeBinInt- 00:08:17.819 [2024-12-13 07:03:35.951599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:35.951627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.819 [2024-12-13 07:03:35.951666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:35.951682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.819 [2024-12-13 07:03:35.951736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:35.951754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.819 [2024-12-13 07:03:35.951798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:35.951813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.819 #57 NEW cov: 11915 ft: 14989 corp: 42/3802b lim: 120 exec/s: 57 rss: 68Mb L: 118/119 MS: 1 InsertRepeatedBytes- 00:08:17.819 [2024-12-13 07:03:35.991529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551577 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:35.991557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.819 [2024-12-13 07:03:35.991595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:35.991610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.819 [2024-12-13 07:03:35.991668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:35.991700] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.819 #58 NEW cov: 11915 ft: 14993 corp: 43/3882b lim: 120 exec/s: 58 rss: 68Mb L: 80/119 MS: 1 InsertByte- 00:08:17.819 [2024-12-13 07:03:36.031676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070975551743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:36.031705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.819 [2024-12-13 07:03:36.031745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:4294967044 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:36.031761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.819 [2024-12-13 07:03:36.031816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.819 [2024-12-13 07:03:36.031831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.078 #59 NEW cov: 11915 ft: 15007 corp: 44/3961b lim: 120 exec/s: 29 rss: 69Mb L: 79/119 MS: 1 PersAutoDict- DE: "\004\000\000\000\000\000\000\000"- 00:08:18.078 #59 DONE cov: 11915 ft: 15007 corp: 44/3961b lim: 120 exec/s: 29 rss: 69Mb 00:08:18.078 ###### Recommended dictionary. ###### 00:08:18.078 "\004\000\000\000\000\000\000\000" # Uses: 4 00:08:18.078 ###### End of recommended dictionary. ###### 00:08:18.078 Done 59 runs in 2 second(s) 00:08:18.078 07:03:36 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:18.078 07:03:36 -- ../common.sh@72 -- # (( i++ )) 00:08:18.078 07:03:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.078 07:03:36 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:18.078 07:03:36 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:18.078 07:03:36 -- nvmf/run.sh@24 -- # local timen=1 00:08:18.078 07:03:36 -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.078 07:03:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:18.078 07:03:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:18.078 07:03:36 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:18.078 07:03:36 -- nvmf/run.sh@29 -- # port=4418 00:08:18.078 07:03:36 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:18.078 07:03:36 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:18.078 07:03:36 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.078 07:03:36 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:18.078 [2024-12-13 07:03:36.213393] 
Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:18.078 [2024-12-13 07:03:36.213468] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid497769 ] 00:08:18.078 EAL: No free 2048 kB hugepages reported on node 1 00:08:18.338 [2024-12-13 07:03:36.397276] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.338 [2024-12-13 07:03:36.417525] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:18.338 [2024-12-13 07:03:36.417648] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.338 [2024-12-13 07:03:36.469019] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.338 [2024-12-13 07:03:36.485365] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:18.338 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.338 INFO: Seed: 2219037188 00:08:18.338 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:18.338 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:18.338 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:18.338 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.338 #2 INITED exec/s: 0 rss: 59Mb 00:08:18.338 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:18.338 This may also happen if the target rejected all inputs we tried so far 00:08:18.338 [2024-12-13 07:03:36.561439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.338 [2024-12-13 07:03:36.561472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.338 [2024-12-13 07:03:36.561595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.338 [2024-12-13 07:03:36.561616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.338 [2024-12-13 07:03:36.561731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.338 [2024-12-13 07:03:36.561754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.856 NEW_FUNC[1/669]: 0x46ea18 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:18.856 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.856 #7 NEW cov: 11618 ft: 11633 corp: 2/61b lim: 100 exec/s: 0 rss: 67Mb L: 60/60 MS: 5 ChangeBinInt-ShuffleBytes-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:18.856 [2024-12-13 07:03:36.912475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.856 [2024-12-13 07:03:36.912524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:36.912661] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.856 [2024-12-13 07:03:36.912688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:36.912831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.856 [2024-12-13 07:03:36.912856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.856 NEW_FUNC[1/1]: 0x1cc31f8 in thread_execute_poller /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:934 00:08:18.856 #9 NEW cov: 11745 ft: 12110 corp: 3/133b lim: 100 exec/s: 0 rss: 67Mb L: 72/72 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:18.856 [2024-12-13 07:03:36.962780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.856 [2024-12-13 07:03:36.962826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:36.962913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.856 [2024-12-13 07:03:36.962936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:36.963075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.856 [2024-12-13 07:03:36.963097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:36.963223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.856 [2024-12-13 07:03:36.963247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.856 #15 NEW cov: 11751 ft: 12699 corp: 4/226b lim: 100 exec/s: 0 rss: 67Mb L: 93/93 MS: 1 InsertRepeatedBytes- 00:08:18.856 [2024-12-13 07:03:37.022941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.856 [2024-12-13 07:03:37.022976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:37.023078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.856 [2024-12-13 07:03:37.023100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:37.023221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.856 [2024-12-13 07:03:37.023236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:37.023359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.856 [2024-12-13 07:03:37.023380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.856 #16 NEW cov: 11836 ft: 12982 corp: 5/322b lim: 100 exec/s: 0 rss: 
67Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:08:18.856 [2024-12-13 07:03:37.083001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.856 [2024-12-13 07:03:37.083047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:37.083178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.856 [2024-12-13 07:03:37.083204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.856 [2024-12-13 07:03:37.083339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.856 [2024-12-13 07:03:37.083362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.116 #17 NEW cov: 11836 ft: 13088 corp: 6/382b lim: 100 exec/s: 0 rss: 67Mb L: 60/96 MS: 1 ChangeBinInt- 00:08:19.116 [2024-12-13 07:03:37.133133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.116 [2024-12-13 07:03:37.133169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 07:03:37.133290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.116 [2024-12-13 07:03:37.133313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 07:03:37.133440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.116 [2024-12-13 07:03:37.133465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.116 #18 NEW cov: 11836 ft: 13147 corp: 7/457b lim: 100 exec/s: 0 rss: 67Mb L: 75/96 MS: 1 CrossOver- 00:08:19.116 [2024-12-13 07:03:37.193353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.116 [2024-12-13 07:03:37.193386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 07:03:37.193513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.116 [2024-12-13 07:03:37.193532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 07:03:37.193657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.116 [2024-12-13 07:03:37.193675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.116 #19 NEW cov: 11836 ft: 13227 corp: 8/517b lim: 100 exec/s: 0 rss: 67Mb L: 60/96 MS: 1 ShuffleBytes- 00:08:19.116 [2024-12-13 07:03:37.243423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.116 [2024-12-13 07:03:37.243450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 
07:03:37.243554] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.116 [2024-12-13 07:03:37.243579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 07:03:37.243708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.116 [2024-12-13 07:03:37.243728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.116 #20 NEW cov: 11836 ft: 13255 corp: 9/578b lim: 100 exec/s: 0 rss: 67Mb L: 61/96 MS: 1 InsertByte- 00:08:19.116 [2024-12-13 07:03:37.293634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.116 [2024-12-13 07:03:37.293662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 07:03:37.293749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.116 [2024-12-13 07:03:37.293774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 07:03:37.293897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.116 [2024-12-13 07:03:37.293919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.116 #21 NEW cov: 11836 ft: 13308 corp: 10/642b lim: 100 exec/s: 0 rss: 68Mb L: 64/96 MS: 1 InsertRepeatedBytes- 00:08:19.116 [2024-12-13 07:03:37.343789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.116 [2024-12-13 07:03:37.343821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 07:03:37.343937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.116 [2024-12-13 07:03:37.343960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.116 [2024-12-13 07:03:37.344083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.116 [2024-12-13 07:03:37.344104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.375 #22 NEW cov: 11836 ft: 13326 corp: 11/715b lim: 100 exec/s: 0 rss: 68Mb L: 73/96 MS: 1 InsertByte- 00:08:19.375 [2024-12-13 07:03:37.394019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.375 [2024-12-13 07:03:37.394051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.375 [2024-12-13 07:03:37.394138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.375 [2024-12-13 07:03:37.394158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.375 [2024-12-13 07:03:37.394289] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.375 [2024-12-13 07:03:37.394308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.375 #23 NEW cov: 11836 ft: 13378 corp: 12/775b lim: 100 exec/s: 0 rss: 68Mb L: 60/96 MS: 1 CopyPart- 00:08:19.375 [2024-12-13 07:03:37.444153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.375 [2024-12-13 07:03:37.444184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.375 [2024-12-13 07:03:37.444303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.375 [2024-12-13 07:03:37.444324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.375 [2024-12-13 07:03:37.444452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.375 [2024-12-13 07:03:37.444470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.375 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:19.375 #24 NEW cov: 11859 ft: 13416 corp: 13/835b lim: 100 exec/s: 0 rss: 68Mb L: 60/96 MS: 1 ChangeBit- 00:08:19.375 [2024-12-13 07:03:37.504629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.375 [2024-12-13 07:03:37.504659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.375 [2024-12-13 07:03:37.504753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.375 [2024-12-13 07:03:37.504774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.375 [2024-12-13 07:03:37.504896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.376 [2024-12-13 07:03:37.504921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.376 [2024-12-13 07:03:37.505047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.376 [2024-12-13 07:03:37.505070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.376 #25 NEW cov: 11859 ft: 13454 corp: 14/931b lim: 100 exec/s: 25 rss: 68Mb L: 96/96 MS: 1 CopyPart- 00:08:19.376 [2024-12-13 07:03:37.564885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.376 [2024-12-13 07:03:37.564921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.376 [2024-12-13 07:03:37.565034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.376 [2024-12-13 07:03:37.565058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.376 [2024-12-13 07:03:37.565200] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.376 [2024-12-13 07:03:37.565221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.376 [2024-12-13 07:03:37.565346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.376 [2024-12-13 07:03:37.565368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.376 #26 NEW cov: 11859 ft: 13460 corp: 15/1027b lim: 100 exec/s: 26 rss: 68Mb L: 96/96 MS: 1 ChangeBinInt- 00:08:19.376 [2024-12-13 07:03:37.614686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.376 [2024-12-13 07:03:37.614721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.376 [2024-12-13 07:03:37.614818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.376 [2024-12-13 07:03:37.614839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.376 [2024-12-13 07:03:37.614979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.376 [2024-12-13 07:03:37.615005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.635 #27 NEW cov: 11859 ft: 13550 corp: 16/1087b lim: 100 exec/s: 27 rss: 68Mb L: 60/96 MS: 1 ChangeByte- 00:08:19.635 [2024-12-13 07:03:37.665004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.635 [2024-12-13 07:03:37.665035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.665126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.635 [2024-12-13 07:03:37.665147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.665271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.635 [2024-12-13 07:03:37.665293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.635 #28 NEW cov: 11859 ft: 13591 corp: 17/1148b lim: 100 exec/s: 28 rss: 68Mb L: 61/96 MS: 1 ShuffleBytes- 00:08:19.635 [2024-12-13 07:03:37.715077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.635 [2024-12-13 07:03:37.715107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.715208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.635 [2024-12-13 07:03:37.715228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.715352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE 
ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.635 [2024-12-13 07:03:37.715375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.635 #29 NEW cov: 11859 ft: 13606 corp: 18/1208b lim: 100 exec/s: 29 rss: 68Mb L: 60/96 MS: 1 ShuffleBytes- 00:08:19.635 [2024-12-13 07:03:37.765421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.635 [2024-12-13 07:03:37.765455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.765553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.635 [2024-12-13 07:03:37.765573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.765715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.635 [2024-12-13 07:03:37.765738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.765874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.635 [2024-12-13 07:03:37.765897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.635 #30 NEW cov: 11859 ft: 13617 corp: 19/1289b lim: 100 exec/s: 30 rss: 68Mb L: 81/96 MS: 1 InsertRepeatedBytes- 00:08:19.635 [2024-12-13 07:03:37.815631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.635 [2024-12-13 07:03:37.815663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.815754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.635 [2024-12-13 07:03:37.815775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.815894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.635 [2024-12-13 07:03:37.815916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.816036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.635 [2024-12-13 07:03:37.816056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.635 #31 NEW cov: 11859 ft: 13702 corp: 20/1384b lim: 100 exec/s: 31 rss: 68Mb L: 95/96 MS: 1 InsertRepeatedBytes- 00:08:19.635 [2024-12-13 07:03:37.865674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.635 [2024-12-13 07:03:37.865703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.865786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.635 
[2024-12-13 07:03:37.865809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.865940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.635 [2024-12-13 07:03:37.865963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.635 [2024-12-13 07:03:37.866096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.635 [2024-12-13 07:03:37.866122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.895 #32 NEW cov: 11859 ft: 13718 corp: 21/1481b lim: 100 exec/s: 32 rss: 68Mb L: 97/97 MS: 1 CrossOver- 00:08:19.895 [2024-12-13 07:03:37.915961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.895 [2024-12-13 07:03:37.915990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:37.916079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.895 [2024-12-13 07:03:37.916102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:37.916235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.895 [2024-12-13 07:03:37.916258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:37.916393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.895 [2024-12-13 07:03:37.916415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.895 #33 NEW cov: 11859 ft: 13728 corp: 22/1567b lim: 100 exec/s: 33 rss: 68Mb L: 86/97 MS: 1 InsertRepeatedBytes- 00:08:19.895 [2024-12-13 07:03:37.965886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.895 [2024-12-13 07:03:37.965917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:37.966007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.895 [2024-12-13 07:03:37.966028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:37.966155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.895 [2024-12-13 07:03:37.966177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.895 #34 NEW cov: 11859 ft: 13739 corp: 23/1628b lim: 100 exec/s: 34 rss: 68Mb L: 61/97 MS: 1 CopyPart- 00:08:19.895 [2024-12-13 07:03:38.015979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.895 [2024-12-13 07:03:38.016011] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:38.016098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.895 [2024-12-13 07:03:38.016116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:38.016248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.895 [2024-12-13 07:03:38.016271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.895 #35 NEW cov: 11859 ft: 13746 corp: 24/1688b lim: 100 exec/s: 35 rss: 68Mb L: 60/97 MS: 1 CopyPart- 00:08:19.895 [2024-12-13 07:03:38.066120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.895 [2024-12-13 07:03:38.066155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:38.066277] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.895 [2024-12-13 07:03:38.066300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:38.066432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.895 [2024-12-13 07:03:38.066456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.895 #36 NEW cov: 11859 ft: 13753 corp: 25/1748b lim: 100 exec/s: 36 rss: 68Mb L: 60/97 MS: 1 CopyPart- 00:08:19.895 [2024-12-13 07:03:38.116335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.895 [2024-12-13 07:03:38.116365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:38.116444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.895 [2024-12-13 07:03:38.116465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.895 [2024-12-13 07:03:38.116590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.895 [2024-12-13 07:03:38.116613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.154 #37 NEW cov: 11859 ft: 13759 corp: 26/1820b lim: 100 exec/s: 37 rss: 68Mb L: 72/97 MS: 1 ChangeBit- 00:08:20.154 [2024-12-13 07:03:38.166790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.154 [2024-12-13 07:03:38.166822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.154 [2024-12-13 07:03:38.166911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.154 [2024-12-13 07:03:38.166947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.154 [2024-12-13 07:03:38.167070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.154 [2024-12-13 07:03:38.167095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.154 [2024-12-13 07:03:38.167211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:20.154 [2024-12-13 07:03:38.167232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.154 #43 NEW cov: 11859 ft: 13786 corp: 27/1916b lim: 100 exec/s: 43 rss: 68Mb L: 96/97 MS: 1 ChangeBinInt- 00:08:20.154 [2024-12-13 07:03:38.226545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.154 [2024-12-13 07:03:38.226577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.154 [2024-12-13 07:03:38.226689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.154 [2024-12-13 07:03:38.226709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.154 #44 NEW cov: 11859 ft: 14098 corp: 28/1971b lim: 100 exec/s: 44 rss: 68Mb L: 55/97 MS: 1 EraseBytes- 00:08:20.154 [2024-12-13 07:03:38.286414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.154 [2024-12-13 07:03:38.286438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.154 #45 NEW cov: 11859 ft: 14512 corp: 29/2003b lim: 100 exec/s: 45 rss: 68Mb L: 32/97 MS: 1 EraseBytes- 00:08:20.154 [2024-12-13 07:03:38.337023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.154 [2024-12-13 07:03:38.337052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.154 [2024-12-13 07:03:38.337140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.154 [2024-12-13 07:03:38.337161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.154 [2024-12-13 07:03:38.337290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.154 [2024-12-13 07:03:38.337312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.154 #46 NEW cov: 11859 ft: 14575 corp: 30/2072b lim: 100 exec/s: 46 rss: 68Mb L: 69/97 MS: 1 CMP- DE: "\377\001\347\262%\031\006\210"- 00:08:20.154 [2024-12-13 07:03:38.386988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.154 [2024-12-13 07:03:38.387023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.154 [2024-12-13 07:03:38.387155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.154 [2024-12-13 07:03:38.387178] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.414 #47 NEW cov: 11859 ft: 14592 corp: 31/2127b lim: 100 exec/s: 47 rss: 68Mb L: 55/97 MS: 1 ChangeBinInt- 00:08:20.414 [2024-12-13 07:03:38.447389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.414 [2024-12-13 07:03:38.447419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.414 [2024-12-13 07:03:38.447528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.414 [2024-12-13 07:03:38.447546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.414 [2024-12-13 07:03:38.447668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.414 [2024-12-13 07:03:38.447686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.414 #48 NEW cov: 11859 ft: 14632 corp: 32/2188b lim: 100 exec/s: 48 rss: 68Mb L: 61/97 MS: 1 ChangeBinInt- 00:08:20.414 [2024-12-13 07:03:38.497513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.414 [2024-12-13 07:03:38.497546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.414 [2024-12-13 07:03:38.497664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.414 [2024-12-13 07:03:38.497685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.414 [2024-12-13 07:03:38.497779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.414 [2024-12-13 07:03:38.497804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.414 #49 NEW cov: 11859 ft: 14638 corp: 33/2265b lim: 100 exec/s: 49 rss: 69Mb L: 77/97 MS: 1 CopyPart- 00:08:20.414 [2024-12-13 07:03:38.547743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.414 [2024-12-13 07:03:38.547774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.414 [2024-12-13 07:03:38.547869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.414 [2024-12-13 07:03:38.547890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.414 [2024-12-13 07:03:38.548018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.414 [2024-12-13 07:03:38.548039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.414 #50 NEW cov: 11859 ft: 14640 corp: 34/2325b lim: 100 exec/s: 25 rss: 69Mb L: 60/97 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:20.414 #50 DONE cov: 11859 ft: 14640 corp: 34/2325b lim: 100 exec/s: 25 rss: 69Mb 00:08:20.414 ###### Recommended dictionary. 
###### 00:08:20.414 "\377\001\347\262%\031\006\210" # Uses: 0 00:08:20.414 "\001\000\000\000" # Uses: 0 00:08:20.414 ###### End of recommended dictionary. ###### 00:08:20.414 Done 50 runs in 2 second(s) 00:08:20.674 07:03:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:08:20.674 07:03:38 -- ../common.sh@72 -- # (( i++ )) 00:08:20.674 07:03:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.674 07:03:38 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:20.674 07:03:38 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:20.674 07:03:38 -- nvmf/run.sh@24 -- # local timen=1 00:08:20.674 07:03:38 -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.674 07:03:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:20.674 07:03:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:20.674 07:03:38 -- nvmf/run.sh@29 -- # printf %02d 19 00:08:20.674 07:03:38 -- nvmf/run.sh@29 -- # port=4419 00:08:20.674 07:03:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:20.674 07:03:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:20.674 07:03:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.674 07:03:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:08:20.674 [2024-12-13 07:03:38.720836] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:20.674 [2024-12-13 07:03:38.720899] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid498207 ] 00:08:20.674 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.674 [2024-12-13 07:03:38.895509] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.933 [2024-12-13 07:03:38.915376] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:20.934 [2024-12-13 07:03:38.915501] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.934 [2024-12-13 07:03:38.966724] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.934 [2024-12-13 07:03:38.983011] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:20.934 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.934 INFO: Seed: 422051975 00:08:20.934 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:20.934 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:20.934 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:20.934 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.934 #2 INITED exec/s: 0 rss: 59Mb 00:08:20.934 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:08:20.934 This may also happen if the target rejected all inputs we tried so far 00:08:20.934 [2024-12-13 07:03:39.049246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:20.934 [2024-12-13 07:03:39.049279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.934 [2024-12-13 07:03:39.049401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:20.934 [2024-12-13 07:03:39.049422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.192 NEW_FUNC[1/670]: 0x4719d8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:21.193 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.193 #8 NEW cov: 11610 ft: 11611 corp: 2/29b lim: 50 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:21.193 [2024-12-13 07:03:39.370039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071595618303 len:65536 00:08:21.193 [2024-12-13 07:03:39.370087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.193 [2024-12-13 07:03:39.370241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:21.193 [2024-12-13 07:03:39.370267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.193 #13 NEW cov: 11723 ft: 12136 corp: 3/57b lim: 50 exec/s: 0 rss: 67Mb L: 28/28 MS: 5 InsertRepeatedBytes-ChangeByte-InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:21.193 [2024-12-13 07:03:39.419971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:9367370219646195093 len:38384 00:08:21.193 [2024-12-13 07:03:39.419999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.452 #14 NEW cov: 11729 ft: 12677 corp: 4/74b lim: 50 exec/s: 0 rss: 67Mb L: 17/28 MS: 1 CrossOver- 00:08:21.452 [2024-12-13 07:03:39.480227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071595618303 len:65536 00:08:21.452 [2024-12-13 07:03:39.480266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.452 [2024-12-13 07:03:39.480382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18435766549617836031 len:65536 00:08:21.452 [2024-12-13 07:03:39.480407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.452 #15 NEW cov: 11814 ft: 12962 corp: 5/102b lim: 50 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 ChangeByte- 00:08:21.452 [2024-12-13 07:03:39.540423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 
nsid:0 lba:10778685752873424277 len:38294 00:08:21.452 [2024-12-13 07:03:39.540463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.452 [2024-12-13 07:03:39.540560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:21.452 [2024-12-13 07:03:39.540579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.452 #16 NEW cov: 11814 ft: 13064 corp: 6/130b lim: 50 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 ShuffleBytes- 00:08:21.452 [2024-12-13 07:03:39.590560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:21.452 [2024-12-13 07:03:39.590593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.452 [2024-12-13 07:03:39.590698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:21.452 [2024-12-13 07:03:39.590720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.452 #17 NEW cov: 11814 ft: 13212 corp: 7/158b lim: 50 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 CrossOver- 00:08:21.452 [2024-12-13 07:03:39.640660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:9367370219646195093 len:4353 00:08:21.452 [2024-12-13 07:03:39.640692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.452 #18 NEW cov: 11814 ft: 13278 corp: 8/175b lim: 50 exec/s: 0 rss: 67Mb L: 17/28 MS: 1 ChangeBinInt- 00:08:21.452 [2024-12-13 07:03:39.690972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:21.452 [2024-12-13 07:03:39.691005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.452 [2024-12-13 07:03:39.691122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10850743346911352213 len:38294 00:08:21.452 [2024-12-13 07:03:39.691150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.711 #19 NEW cov: 11814 ft: 13333 corp: 9/203b lim: 50 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 ChangeBinInt- 00:08:21.711 [2024-12-13 07:03:39.741062] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071595618303 len:65536 00:08:21.711 [2024-12-13 07:03:39.741093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.711 [2024-12-13 07:03:39.741205] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18435766549617836031 len:65536 00:08:21.711 [2024-12-13 07:03:39.741229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.711 #20 NEW cov: 11814 ft: 13375 corp: 10/232b lim: 50 exec/s: 0 rss: 68Mb 
L: 29/29 MS: 1 InsertByte- 00:08:21.711 [2024-12-13 07:03:39.791074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:9367370215351227797 len:38384 00:08:21.711 [2024-12-13 07:03:39.791107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.711 #26 NEW cov: 11814 ft: 13403 corp: 11/249b lim: 50 exec/s: 0 rss: 68Mb L: 17/29 MS: 1 ChangeBit- 00:08:21.711 [2024-12-13 07:03:39.841555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071595618303 len:65536 00:08:21.711 [2024-12-13 07:03:39.841590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.711 [2024-12-13 07:03:39.841677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18435766549617836031 len:65536 00:08:21.711 [2024-12-13 07:03:39.841702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.711 [2024-12-13 07:03:39.841829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18441396049152022527 len:65314 00:08:21.711 [2024-12-13 07:03:39.841855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.711 #27 NEW cov: 11814 ft: 13715 corp: 12/279b lim: 50 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 InsertByte- 00:08:21.711 [2024-12-13 07:03:39.901771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:21.712 [2024-12-13 07:03:39.901802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.712 [2024-12-13 07:03:39.901916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:21.712 [2024-12-13 07:03:39.901940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.712 [2024-12-13 07:03:39.902058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:38294 00:08:21.712 [2024-12-13 07:03:39.902081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.712 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.712 #33 NEW cov: 11837 ft: 13729 corp: 13/315b lim: 50 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:21.971 [2024-12-13 07:03:39.951580] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18416790720893785473 len:38384 00:08:21.971 [2024-12-13 07:03:39.951613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.971 #34 NEW cov: 11837 ft: 13750 corp: 14/332b lim: 50 exec/s: 0 rss: 68Mb L: 17/36 MS: 1 ShuffleBytes- 00:08:21.971 [2024-12-13 07:03:40.002124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 
lba:10778685752873424277 len:38294 00:08:21.971 [2024-12-13 07:03:40.002157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.971 [2024-12-13 07:03:40.002257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10746714613322454421 len:150 00:08:21.971 [2024-12-13 07:03:40.002278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.971 [2024-12-13 07:03:40.002393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:38294 00:08:21.971 [2024-12-13 07:03:40.002416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.971 #35 NEW cov: 11837 ft: 13772 corp: 15/368b lim: 50 exec/s: 35 rss: 68Mb L: 36/36 MS: 1 ChangeBinInt- 00:08:21.971 [2024-12-13 07:03:40.062095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446726481515249663 len:65536 00:08:21.971 [2024-12-13 07:03:40.062137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.971 [2024-12-13 07:03:40.062260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:21.971 [2024-12-13 07:03:40.062284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.971 #36 NEW cov: 11837 ft: 13787 corp: 16/396b lim: 50 exec/s: 36 rss: 68Mb L: 28/36 MS: 1 ShuffleBytes- 00:08:21.971 [2024-12-13 07:03:40.112269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:21.971 [2024-12-13 07:03:40.112302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.971 [2024-12-13 07:03:40.112418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:21.971 [2024-12-13 07:03:40.112440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.971 #37 NEW cov: 11837 ft: 13795 corp: 17/424b lim: 50 exec/s: 37 rss: 68Mb L: 28/36 MS: 1 ShuffleBytes- 00:08:21.971 [2024-12-13 07:03:40.162213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18416907724392863105 len:38294 00:08:21.971 [2024-12-13 07:03:40.162243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.971 #38 NEW cov: 11837 ft: 13820 corp: 18/435b lim: 50 exec/s: 38 rss: 68Mb L: 11/36 MS: 1 EraseBytes- 00:08:22.231 [2024-12-13 07:03:40.212823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:22.231 [2024-12-13 07:03:40.212853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.231 [2024-12-13 07:03:40.212977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:22.231 [2024-12-13 07:03:40.213002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.231 [2024-12-13 07:03:40.213122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:10778685752873424277 len:38294 00:08:22.231 [2024-12-13 07:03:40.213145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.231 #39 NEW cov: 11837 ft: 13828 corp: 19/471b lim: 50 exec/s: 39 rss: 68Mb L: 36/36 MS: 1 CopyPart- 00:08:22.231 [2024-12-13 07:03:40.262795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071595618303 len:65536 00:08:22.231 [2024-12-13 07:03:40.262825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.231 [2024-12-13 07:03:40.262949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18435766549617836031 len:65536 00:08:22.231 [2024-12-13 07:03:40.262970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.231 #40 NEW cov: 11837 ft: 13880 corp: 20/500b lim: 50 exec/s: 40 rss: 68Mb L: 29/36 MS: 1 ChangeBit- 00:08:22.231 [2024-12-13 07:03:40.312952] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071595618303 len:65536 00:08:22.231 [2024-12-13 07:03:40.312984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.231 [2024-12-13 07:03:40.313102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18385945478740049919 len:63232 00:08:22.231 [2024-12-13 07:03:40.313126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.231 #41 NEW cov: 11837 ft: 13904 corp: 21/529b lim: 50 exec/s: 41 rss: 68Mb L: 29/36 MS: 1 ChangeBinInt- 00:08:22.231 [2024-12-13 07:03:40.362945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:9367370215351227797 len:38384 00:08:22.231 [2024-12-13 07:03:40.362976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.231 #42 NEW cov: 11837 ft: 13912 corp: 22/547b lim: 50 exec/s: 42 rss: 68Mb L: 18/36 MS: 1 InsertByte- 00:08:22.231 [2024-12-13 07:03:40.413356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:22.231 [2024-12-13 07:03:40.413388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.231 [2024-12-13 07:03:40.413474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:22.231 [2024-12-13 07:03:40.413512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.231 [2024-12-13 07:03:40.413640] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:38294 00:08:22.231 [2024-12-13 07:03:40.413663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.231 #43 NEW cov: 11837 ft: 13923 corp: 23/583b lim: 50 exec/s: 43 rss: 68Mb L: 36/36 MS: 1 ChangeByte- 00:08:22.231 [2024-12-13 07:03:40.463482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:642459735296 len:38294 00:08:22.231 [2024-12-13 07:03:40.463515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.231 [2024-12-13 07:03:40.463623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:22.231 [2024-12-13 07:03:40.463650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.231 [2024-12-13 07:03:40.463762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:10778685752873424277 len:38294 00:08:22.231 [2024-12-13 07:03:40.463782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.490 #44 NEW cov: 11837 ft: 13933 corp: 24/615b lim: 50 exec/s: 44 rss: 68Mb L: 32/36 MS: 1 InsertRepeatedBytes- 00:08:22.490 [2024-12-13 07:03:40.513372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:22.490 [2024-12-13 07:03:40.513405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.490 #45 NEW cov: 11837 ft: 13942 corp: 25/631b lim: 50 exec/s: 45 rss: 68Mb L: 16/36 MS: 1 EraseBytes- 00:08:22.490 [2024-12-13 07:03:40.563605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8049030959038372136 len:243 00:08:22.490 [2024-12-13 07:03:40.563633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.490 #49 NEW cov: 11837 ft: 13946 corp: 26/641b lim: 50 exec/s: 49 rss: 68Mb L: 10/36 MS: 4 ChangeBinInt-CMP-ShuffleBytes-InsertByte- DE: "\373\232\231o\263\347\002\000"- 00:08:22.490 [2024-12-13 07:03:40.613811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071595618303 len:65536 00:08:22.490 [2024-12-13 07:03:40.613842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.490 [2024-12-13 07:03:40.613971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446723182988623871 len:65536 00:08:22.490 [2024-12-13 07:03:40.613989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.490 #50 NEW cov: 11837 ft: 13971 corp: 27/662b lim: 50 exec/s: 50 rss: 68Mb L: 21/36 MS: 1 EraseBytes- 00:08:22.490 [2024-12-13 07:03:40.673858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752806342639 len:33280 
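The `#N NEW cov: ...` records above are libFuzzer status lines: the leading number is the count of inputs executed so far, `cov` the number of coverage points hit, `ft` the number of features, `corp` the corpus size as units/bytes, `lim` the current input-length cap, `exec/s` the execution rate, `rss` resident memory, `L` the new unit's size versus the largest seen, and `MS:`/`DE:` the mutation sequence and dictionary entries that produced it. A minimal sketch for pulling the end-of-run summaries out of a saved copy of this console output — the build.log filename is an illustrative assumption, not a file produced by this job:

# Final coverage/corpus summary printed by each fuzz target ("#N DONE ..."):
grep -E '#[0-9]+ DONE cov:' build.log

# Edge-coverage figure of the most recently completed run:
grep -oE 'DONE cov: [0-9]+' build.log | awk '{print $3}' | tail -n 1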
00:08:22.490 [2024-12-13 07:03:40.673889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.490 #53 NEW cov: 11837 ft: 13982 corp: 28/680b lim: 50 exec/s: 53 rss: 68Mb L: 18/36 MS: 3 CrossOver-ChangeBit-CrossOver- 00:08:22.750 [2024-12-13 07:03:40.734486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071595618303 len:65536 00:08:22.750 [2024-12-13 07:03:40.734520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.750 [2024-12-13 07:03:40.734636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:55552 00:08:22.750 [2024-12-13 07:03:40.734657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.750 [2024-12-13 07:03:40.734775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446743992105172991 len:65536 00:08:22.750 [2024-12-13 07:03:40.734796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.750 #54 NEW cov: 11837 ft: 13998 corp: 29/712b lim: 50 exec/s: 54 rss: 68Mb L: 32/36 MS: 1 CopyPart- 00:08:22.750 [2024-12-13 07:03:40.784244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:9943830967654651285 len:38384 00:08:22.750 [2024-12-13 07:03:40.784274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.750 #55 NEW cov: 11837 ft: 14006 corp: 30/730b lim: 50 exec/s: 55 rss: 68Mb L: 18/36 MS: 1 ChangeBinInt- 00:08:22.750 [2024-12-13 07:03:40.844367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:22.750 [2024-12-13 07:03:40.844398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.750 #56 NEW cov: 11837 ft: 14009 corp: 31/742b lim: 50 exec/s: 56 rss: 69Mb L: 12/36 MS: 1 CrossOver- 00:08:22.750 [2024-12-13 07:03:40.905057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:22.750 [2024-12-13 07:03:40.905091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.750 [2024-12-13 07:03:40.905185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:22.750 [2024-12-13 07:03:40.905214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.750 [2024-12-13 07:03:40.905342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:10778685752873424277 len:38294 00:08:22.750 [2024-12-13 07:03:40.905364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.750 [2024-12-13 07:03:40.905495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:11056253417446308762 len:513 00:08:22.750 [2024-12-13 07:03:40.905519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.750 #57 NEW cov: 11837 ft: 14279 corp: 32/786b lim: 50 exec/s: 57 rss: 69Mb L: 44/44 MS: 1 PersAutoDict- DE: "\373\232\231o\263\347\002\000"- 00:08:22.750 [2024-12-13 07:03:40.965040] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:38294 00:08:22.750 [2024-12-13 07:03:40.965078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.750 [2024-12-13 07:03:40.965185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752866805141 len:38294 00:08:22.750 [2024-12-13 07:03:40.965212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.750 [2024-12-13 07:03:40.965334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:38294 00:08:22.750 [2024-12-13 07:03:40.965356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.750 #58 NEW cov: 11837 ft: 14311 corp: 33/822b lim: 50 exec/s: 58 rss: 69Mb L: 36/44 MS: 1 ChangeByte- 00:08:23.010 [2024-12-13 07:03:41.015101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:10778685752873424277 len:24726 00:08:23.010 [2024-12-13 07:03:41.015137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.010 [2024-12-13 07:03:41.015264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:10778685752873424277 len:38294 00:08:23.010 [2024-12-13 07:03:41.015286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.010 #59 NEW cov: 11837 ft: 14323 corp: 34/851b lim: 50 exec/s: 29 rss: 69Mb L: 29/44 MS: 1 InsertByte- 00:08:23.010 #59 DONE cov: 11837 ft: 14323 corp: 34/851b lim: 50 exec/s: 29 rss: 69Mb 00:08:23.010 ###### Recommended dictionary. ###### 00:08:23.010 "\373\232\231o\263\347\002\000" # Uses: 1 00:08:23.010 ###### End of recommended dictionary. 
###### 00:08:23.010 Done 59 runs in 2 second(s) 00:08:23.010 07:03:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:23.010 07:03:41 -- ../common.sh@72 -- # (( i++ )) 00:08:23.010 07:03:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.010 07:03:41 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:23.010 07:03:41 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:23.010 07:03:41 -- nvmf/run.sh@24 -- # local timen=1 00:08:23.010 07:03:41 -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.010 07:03:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:23.010 07:03:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:23.010 07:03:41 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:23.010 07:03:41 -- nvmf/run.sh@29 -- # port=4420 00:08:23.010 07:03:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:23.010 07:03:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:23.010 07:03:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.010 07:03:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:23.010 [2024-12-13 07:03:41.199967] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:23.010 [2024-12-13 07:03:41.200042] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid498747 ] 00:08:23.010 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.269 [2024-12-13 07:03:41.376451] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.269 [2024-12-13 07:03:41.396127] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:23.269 [2024-12-13 07:03:41.396266] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.269 [2024-12-13 07:03:41.447495] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.269 [2024-12-13 07:03:41.463787] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:23.269 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.269 INFO: Seed: 2902042722 00:08:23.269 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:23.269 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:23.269 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:23.269 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.269 #2 INITED exec/s: 0 rss: 59Mb 00:08:23.269 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:23.269 This may also happen if the target rejected all inputs we tried so far 00:08:23.528 [2024-12-13 07:03:41.528881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.528 [2024-12-13 07:03:41.528910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.788 NEW_FUNC[1/671]: 0x473598 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:23.788 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.788 #13 NEW cov: 11665 ft: 11669 corp: 2/33b lim: 90 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:23.788 [2024-12-13 07:03:41.849841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.788 [2024-12-13 07:03:41.849899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.788 NEW_FUNC[1/1]: 0x1c716e8 in spdk_thread_is_exited /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:728 00:08:23.788 #18 NEW cov: 11781 ft: 12349 corp: 3/60b lim: 90 exec/s: 0 rss: 67Mb L: 27/32 MS: 5 CopyPart-InsertByte-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:23.788 [2024-12-13 07:03:41.900075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.788 [2024-12-13 07:03:41.900102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.788 [2024-12-13 07:03:41.900140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.788 [2024-12-13 07:03:41.900157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.788 [2024-12-13 07:03:41.900216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.788 [2024-12-13 07:03:41.900231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.788 #19 NEW cov: 11787 ft: 13424 corp: 4/130b lim: 90 exec/s: 0 rss: 67Mb L: 70/70 MS: 1 InsertRepeatedBytes- 00:08:23.788 [2024-12-13 07:03:41.939884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.788 [2024-12-13 07:03:41.939910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.788 #20 NEW cov: 11872 ft: 13766 corp: 5/163b lim: 90 exec/s: 0 rss: 67Mb L: 33/70 MS: 1 CrossOver- 00:08:23.788 [2024-12-13 07:03:41.979977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.788 [2024-12-13 07:03:41.980004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.788 #26 NEW cov: 11872 ft: 14010 corp: 6/195b lim: 90 exec/s: 0 rss: 67Mb L: 32/70 MS: 1 ShuffleBytes- 00:08:23.788 [2024-12-13 07:03:42.020094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) 
sqid:1 cid:0 nsid:0 00:08:23.788 [2024-12-13 07:03:42.020120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.047 #27 NEW cov: 11872 ft: 14214 corp: 7/227b lim: 90 exec/s: 0 rss: 67Mb L: 32/70 MS: 1 ChangeBinInt- 00:08:24.047 [2024-12-13 07:03:42.060336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.047 [2024-12-13 07:03:42.060361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.047 [2024-12-13 07:03:42.060397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.047 [2024-12-13 07:03:42.060413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.047 #28 NEW cov: 11872 ft: 14577 corp: 8/267b lim: 90 exec/s: 0 rss: 67Mb L: 40/70 MS: 1 CrossOver- 00:08:24.047 [2024-12-13 07:03:42.110359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.047 [2024-12-13 07:03:42.110387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.047 #29 NEW cov: 11872 ft: 14617 corp: 9/300b lim: 90 exec/s: 0 rss: 67Mb L: 33/70 MS: 1 InsertByte- 00:08:24.047 [2024-12-13 07:03:42.150451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.047 [2024-12-13 07:03:42.150479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.047 #30 NEW cov: 11872 ft: 14678 corp: 10/321b lim: 90 exec/s: 0 rss: 67Mb L: 21/70 MS: 1 CrossOver- 00:08:24.047 [2024-12-13 07:03:42.190553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.047 [2024-12-13 07:03:42.190580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.047 #31 NEW cov: 11872 ft: 14783 corp: 11/342b lim: 90 exec/s: 0 rss: 68Mb L: 21/70 MS: 1 ChangeByte- 00:08:24.047 [2024-12-13 07:03:42.230731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.047 [2024-12-13 07:03:42.230758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.047 #32 NEW cov: 11872 ft: 14865 corp: 12/374b lim: 90 exec/s: 0 rss: 68Mb L: 32/70 MS: 1 ChangeBinInt- 00:08:24.047 [2024-12-13 07:03:42.270768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.047 [2024-12-13 07:03:42.270799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.307 #33 NEW cov: 11872 ft: 14920 corp: 13/404b lim: 90 exec/s: 0 rss: 68Mb L: 30/70 MS: 1 EraseBytes- 00:08:24.307 [2024-12-13 07:03:42.310898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.307 [2024-12-13 07:03:42.310926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.307 
#34 NEW cov: 11872 ft: 14979 corp: 14/436b lim: 90 exec/s: 0 rss: 68Mb L: 32/70 MS: 1 ShuffleBytes- 00:08:24.307 [2024-12-13 07:03:42.351021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.307 [2024-12-13 07:03:42.351048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.307 #35 NEW cov: 11872 ft: 15026 corp: 15/470b lim: 90 exec/s: 0 rss: 68Mb L: 34/70 MS: 1 CMP- DE: "\001\000"- 00:08:24.307 [2024-12-13 07:03:42.391145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.307 [2024-12-13 07:03:42.391173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.307 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:24.307 #36 NEW cov: 11895 ft: 15069 corp: 16/499b lim: 90 exec/s: 0 rss: 68Mb L: 29/70 MS: 1 EraseBytes- 00:08:24.307 [2024-12-13 07:03:42.431700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.307 [2024-12-13 07:03:42.431728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.307 [2024-12-13 07:03:42.431786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.307 [2024-12-13 07:03:42.431800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.307 [2024-12-13 07:03:42.431853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.307 [2024-12-13 07:03:42.431868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.307 [2024-12-13 07:03:42.431921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.307 [2024-12-13 07:03:42.431934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.307 #37 NEW cov: 11895 ft: 15450 corp: 17/588b lim: 90 exec/s: 0 rss: 68Mb L: 89/89 MS: 1 CrossOver- 00:08:24.307 [2024-12-13 07:03:42.481411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.307 [2024-12-13 07:03:42.481438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.307 #38 NEW cov: 11895 ft: 15498 corp: 18/617b lim: 90 exec/s: 0 rss: 68Mb L: 29/89 MS: 1 CMP- DE: "\000\002\347\264\216T\212N"- 00:08:24.307 [2024-12-13 07:03:42.521638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.307 [2024-12-13 07:03:42.521665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.307 [2024-12-13 07:03:42.521704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.307 [2024-12-13 07:03:42.521719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.567 #39 NEW cov: 11895 ft: 15525 corp: 19/660b lim: 90 exec/s: 39 rss: 68Mb L: 43/89 MS: 1 InsertRepeatedBytes- 00:08:24.567 [2024-12-13 07:03:42.571658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.567 [2024-12-13 07:03:42.571685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.567 #40 NEW cov: 11895 ft: 15552 corp: 20/681b lim: 90 exec/s: 40 rss: 68Mb L: 21/89 MS: 1 CopyPart- 00:08:24.567 [2024-12-13 07:03:42.601750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.567 [2024-12-13 07:03:42.601776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.567 #41 NEW cov: 11895 ft: 15616 corp: 21/714b lim: 90 exec/s: 41 rss: 68Mb L: 33/89 MS: 1 InsertByte- 00:08:24.567 [2024-12-13 07:03:42.631988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.567 [2024-12-13 07:03:42.632014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.567 [2024-12-13 07:03:42.632065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.567 [2024-12-13 07:03:42.632080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.567 #42 NEW cov: 11895 ft: 15663 corp: 22/750b lim: 90 exec/s: 42 rss: 68Mb L: 36/89 MS: 1 InsertRepeatedBytes- 00:08:24.567 [2024-12-13 07:03:42.671935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.567 [2024-12-13 07:03:42.671961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.567 #43 NEW cov: 11895 ft: 15677 corp: 23/784b lim: 90 exec/s: 43 rss: 68Mb L: 34/89 MS: 1 InsertByte- 00:08:24.567 [2024-12-13 07:03:42.712190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.567 [2024-12-13 07:03:42.712218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.567 [2024-12-13 07:03:42.712285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.567 [2024-12-13 07:03:42.712301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.567 #44 NEW cov: 11895 ft: 15738 corp: 24/820b lim: 90 exec/s: 44 rss: 68Mb L: 36/89 MS: 1 EraseBytes- 00:08:24.567 [2024-12-13 07:03:42.752180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.567 [2024-12-13 07:03:42.752209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.567 #47 NEW cov: 11895 ft: 15750 corp: 25/838b lim: 90 exec/s: 47 rss: 68Mb L: 18/89 MS: 3 CrossOver-ChangeByte-InsertByte- 00:08:24.567 [2024-12-13 07:03:42.792591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE 
(11) sqid:1 cid:0 nsid:0 00:08:24.567 [2024-12-13 07:03:42.792618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.567 [2024-12-13 07:03:42.792676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.567 [2024-12-13 07:03:42.792692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.567 [2024-12-13 07:03:42.792747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.567 [2024-12-13 07:03:42.792761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.826 #48 NEW cov: 11895 ft: 15766 corp: 26/897b lim: 90 exec/s: 48 rss: 68Mb L: 59/89 MS: 1 InsertRepeatedBytes- 00:08:24.826 [2024-12-13 07:03:42.832402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.826 [2024-12-13 07:03:42.832430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.826 #49 NEW cov: 11895 ft: 15770 corp: 27/930b lim: 90 exec/s: 49 rss: 68Mb L: 33/89 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:24.826 [2024-12-13 07:03:42.872512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.826 [2024-12-13 07:03:42.872538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.826 #50 NEW cov: 11895 ft: 15799 corp: 28/952b lim: 90 exec/s: 50 rss: 68Mb L: 22/89 MS: 1 InsertByte- 00:08:24.826 [2024-12-13 07:03:42.912889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.826 [2024-12-13 07:03:42.912915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.826 [2024-12-13 07:03:42.912973] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.826 [2024-12-13 07:03:42.912989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.826 [2024-12-13 07:03:42.913043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.826 [2024-12-13 07:03:42.913058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.826 #53 NEW cov: 11895 ft: 15810 corp: 29/1020b lim: 90 exec/s: 53 rss: 68Mb L: 68/89 MS: 3 EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:24.826 [2024-12-13 07:03:42.953353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.826 [2024-12-13 07:03:42.953380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.826 [2024-12-13 07:03:42.953435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.826 [2024-12-13 07:03:42.953451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.826 [2024-12-13 07:03:42.953503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.826 [2024-12-13 07:03:42.953518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.826 [2024-12-13 07:03:42.953568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.826 [2024-12-13 07:03:42.953583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.826 [2024-12-13 07:03:42.953636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:24.826 [2024-12-13 07:03:42.953651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:24.826 #54 NEW cov: 11895 ft: 15874 corp: 30/1110b lim: 90 exec/s: 54 rss: 68Mb L: 90/90 MS: 1 CrossOver- 00:08:24.826 [2024-12-13 07:03:42.992841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.826 [2024-12-13 07:03:42.992867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.827 #55 NEW cov: 11895 ft: 15882 corp: 31/1143b lim: 90 exec/s: 55 rss: 68Mb L: 33/90 MS: 1 ShuffleBytes- 00:08:24.827 [2024-12-13 07:03:43.032953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.827 [2024-12-13 07:03:43.032980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.827 #56 NEW cov: 11895 ft: 15886 corp: 32/1166b lim: 90 exec/s: 56 rss: 69Mb L: 23/90 MS: 1 InsertByte- 00:08:25.086 [2024-12-13 07:03:43.073520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.086 [2024-12-13 07:03:43.073547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.086 [2024-12-13 07:03:43.073593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.086 [2024-12-13 07:03:43.073609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.086 [2024-12-13 07:03:43.073662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.086 [2024-12-13 07:03:43.073677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.086 [2024-12-13 07:03:43.073728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.086 [2024-12-13 07:03:43.073743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.086 #57 NEW cov: 11895 ft: 15899 corp: 33/1240b lim: 90 exec/s: 57 rss: 69Mb L: 74/90 MS: 1 InsertRepeatedBytes- 00:08:25.086 [2024-12-13 07:03:43.113182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.086 
[2024-12-13 07:03:43.113211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.086 #58 NEW cov: 11895 ft: 15920 corp: 34/1272b lim: 90 exec/s: 58 rss: 69Mb L: 32/90 MS: 1 CrossOver- 00:08:25.086 [2024-12-13 07:03:43.153571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.086 [2024-12-13 07:03:43.153598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.086 [2024-12-13 07:03:43.153636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.086 [2024-12-13 07:03:43.153650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.086 [2024-12-13 07:03:43.153703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.086 [2024-12-13 07:03:43.153717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.086 #59 NEW cov: 11895 ft: 15937 corp: 35/1333b lim: 90 exec/s: 59 rss: 69Mb L: 61/90 MS: 1 CopyPart- 00:08:25.086 [2024-12-13 07:03:43.193418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.086 [2024-12-13 07:03:43.193444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.086 #60 NEW cov: 11895 ft: 16012 corp: 36/1365b lim: 90 exec/s: 60 rss: 69Mb L: 32/90 MS: 1 ChangeByte- 00:08:25.086 [2024-12-13 07:03:43.233807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.086 [2024-12-13 07:03:43.233835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.086 [2024-12-13 07:03:43.233892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.086 [2024-12-13 07:03:43.233908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.086 [2024-12-13 07:03:43.233962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.086 [2024-12-13 07:03:43.233977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.086 #61 NEW cov: 11895 ft: 16016 corp: 37/1424b lim: 90 exec/s: 61 rss: 69Mb L: 59/90 MS: 1 ChangeByte- 00:08:25.086 [2024-12-13 07:03:43.273772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.086 [2024-12-13 07:03:43.273805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.086 [2024-12-13 07:03:43.273877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.086 [2024-12-13 07:03:43.273893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.086 #62 NEW cov: 11895 ft: 16019 corp: 
38/1464b lim: 90 exec/s: 62 rss: 69Mb L: 40/90 MS: 1 CrossOver- 00:08:25.086 [2024-12-13 07:03:43.313767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.086 [2024-12-13 07:03:43.313793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.346 #63 NEW cov: 11895 ft: 16021 corp: 39/1497b lim: 90 exec/s: 63 rss: 69Mb L: 33/90 MS: 1 ChangeBinInt- 00:08:25.346 [2024-12-13 07:03:43.353864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.346 [2024-12-13 07:03:43.353890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.346 #64 NEW cov: 11895 ft: 16048 corp: 40/1529b lim: 90 exec/s: 64 rss: 69Mb L: 32/90 MS: 1 CopyPart- 00:08:25.346 [2024-12-13 07:03:43.394415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.346 [2024-12-13 07:03:43.394442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.346 [2024-12-13 07:03:43.394490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.346 [2024-12-13 07:03:43.394506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.346 [2024-12-13 07:03:43.394557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.346 [2024-12-13 07:03:43.394572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.346 [2024-12-13 07:03:43.394624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.346 [2024-12-13 07:03:43.394638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.346 #65 NEW cov: 11895 ft: 16053 corp: 41/1602b lim: 90 exec/s: 65 rss: 69Mb L: 73/90 MS: 1 EraseBytes- 00:08:25.346 [2024-12-13 07:03:43.434075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.346 [2024-12-13 07:03:43.434101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.346 #66 NEW cov: 11895 ft: 16058 corp: 42/1623b lim: 90 exec/s: 66 rss: 69Mb L: 21/90 MS: 1 ChangeBit- 00:08:25.346 [2024-12-13 07:03:43.464591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.346 [2024-12-13 07:03:43.464617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.346 [2024-12-13 07:03:43.464664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.346 [2024-12-13 07:03:43.464679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.346 [2024-12-13 07:03:43.464728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 
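Each fuzzed command shows up in this log as a NOTICE pair: nvme_io_qpair_print_command echoes the submission (opcode, sqid/cid, nsid, lba, len) and spdk_nvme_print_completion echoes the matching completion, which for these generated commands is INVALID NAMESPACE OR FORMAT (00/0b) — status code type 0, generic status code 0x0b — together with the completion-queue entry's p (phase tag), m (more), and dnr (do-not-retry) bits. A sketch, under the same assumed build.log name as above, for counting how many submissions the target rejected this way:

# Count completions carrying INVALID NAMESPACE OR FORMAT (00/0b);
# grep -o counts every occurrence even when entries share a line:
grep -o 'INVALID NAMESPACE OR FORMAT (00/0b)' build.log | wc -l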
00:08:25.346 [2024-12-13 07:03:43.464743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.346 [2024-12-13 07:03:43.464794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.346 [2024-12-13 07:03:43.464812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.346 #67 NEW cov: 11895 ft: 16066 corp: 43/1696b lim: 90 exec/s: 67 rss: 69Mb L: 73/90 MS: 1 ChangeByte- 00:08:25.346 [2024-12-13 07:03:43.504724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.346 [2024-12-13 07:03:43.504750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.346 [2024-12-13 07:03:43.504793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.346 [2024-12-13 07:03:43.504809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.346 [2024-12-13 07:03:43.504861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.346 [2024-12-13 07:03:43.504893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.346 [2024-12-13 07:03:43.504944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.346 [2024-12-13 07:03:43.504959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.346 #68 NEW cov: 11895 ft: 16101 corp: 44/1769b lim: 90 exec/s: 34 rss: 69Mb L: 73/90 MS: 1 ChangeBit- 00:08:25.346 #68 DONE cov: 11895 ft: 16101 corp: 44/1769b lim: 90 exec/s: 34 rss: 69Mb 00:08:25.346 ###### Recommended dictionary. ###### 00:08:25.346 "\001\000" # Uses: 1 00:08:25.346 "\000\002\347\264\216T\212N" # Uses: 0 00:08:25.346 ###### End of recommended dictionary. 
###### 00:08:25.346 Done 68 runs in 2 second(s) 00:08:25.606 07:03:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:25.606 07:03:43 -- ../common.sh@72 -- # (( i++ )) 00:08:25.606 07:03:43 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.606 07:03:43 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:25.606 07:03:43 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:25.606 07:03:43 -- nvmf/run.sh@24 -- # local timen=1 00:08:25.606 07:03:43 -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.606 07:03:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.606 07:03:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:25.606 07:03:43 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:25.606 07:03:43 -- nvmf/run.sh@29 -- # port=4421 00:08:25.606 07:03:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.606 07:03:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:25.606 07:03:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.606 07:03:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:25.606 [2024-12-13 07:03:43.682065] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:25.606 [2024-12-13 07:03:43.682160] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid499053 ] 00:08:25.606 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.865 [2024-12-13 07:03:43.866463] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.865 [2024-12-13 07:03:43.887659] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:25.865 [2024-12-13 07:03:43.887784] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.865 [2024-12-13 07:03:43.939274] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.865 [2024-12-13 07:03:43.955600] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:25.865 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.865 INFO: Seed: 1100104316 00:08:25.865 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:25.865 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:25.865 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.865 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.865 #2 INITED exec/s: 0 rss: 59Mb 00:08:25.865 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:25.865 This may also happen if the target rejected all inputs we tried so far 00:08:25.865 [2024-12-13 07:03:44.031950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.865 [2024-12-13 07:03:44.031988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.865 [2024-12-13 07:03:44.032113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.865 [2024-12-13 07:03:44.032136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.865 [2024-12-13 07:03:44.032267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.865 [2024-12-13 07:03:44.032288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.124 NEW_FUNC[1/671]: 0x4767c8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:26.124 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.124 #12 NEW cov: 11628 ft: 11644 corp: 2/40b lim: 50 exec/s: 0 rss: 67Mb L: 39/39 MS: 5 ChangeByte-ShuffleBytes-ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:08:26.124 [2024-12-13 07:03:44.353006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.124 [2024-12-13 07:03:44.353044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.124 [2024-12-13 07:03:44.353183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.124 [2024-12-13 07:03:44.353213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.124 [2024-12-13 07:03:44.353346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.124 [2024-12-13 07:03:44.353367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.124 [2024-12-13 07:03:44.353501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.124 [2024-12-13 07:03:44.353527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.383 NEW_FUNC[1/1]: 0x19613b8 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:528 00:08:26.383 #16 NEW cov: 11756 ft: 12514 corp: 3/88b lim: 50 exec/s: 0 rss: 67Mb L: 48/48 MS: 4 ChangeBit-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:26.383 [2024-12-13 07:03:44.402376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.383 [2024-12-13 07:03:44.402407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.383 #22 NEW cov: 11762 ft: 13529 corp: 4/99b lim: 50 exec/s: 0 rss: 67Mb L: 11/48 MS: 1 CrossOver- 
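The nvmf/run.sh xtrace above shows the shape of every round in this job: common.sh advances the loop counter, start_llvm_fuzz selects the fuzzer type (here 21, the reservation-release fuzzer named in the NEW_FUNC lines just above), run.sh derives a unique TCP port and corpus directory from that index, rewrites trsvcid in the shared fuzz_json.conf template, and launches llvm_nvme_fuzz against the freshly started NVMe/TCP target. A minimal bash sketch of one such round, reconstructed only from the flags visible in this trace (SPDK_DIR, fuzz_num, and the sed redirect are assumptions; the real logic lives in test/fuzz/llvm/nvmf/run.sh and common.sh):

    # Sketch of the traced per-round setup, not the actual run.sh.
    SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # assumed tree root
    fuzz_num=25                              # assumed; the trace only shows (( i < fuzz_num ))
    for ((i = 0; i < fuzz_num; i++)); do
        port="44$(printf %02d "$i")"         # fuzzer 21 -> port 4421, as traced
        corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_$i"
        nvmf_cfg="/tmp/fuzz_json_$i.conf"
        mkdir -p "$corpus_dir"
        # The xtrace shows the sed itself; writing its output to $nvmf_cfg is
        # implied by the later -c flag rather than visible in the trace.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
        "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
            -m 0x1 -s 512 -P "$SPDK_DIR/../output/llvm/" \
            -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
            -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$i" -r "/var/tmp/spdk$i.sock"
        rm -rf "$nvmf_cfg"
    done

In the libFuzzer status lines around this point, cov counts instrumented edges hit so far, ft counts coverage features, corp is the corpus size in inputs and bytes, lim is the current input-length cap, exec/s and rss report throughput and memory, L is the length of the new input, and MS names the mutation sequence (ChangeByte, CrossOver, and so on) that produced it.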
00:08:26.383 [2024-12-13 07:03:44.462600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.383 [2024-12-13 07:03:44.462640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.383 #23 NEW cov: 11847 ft: 13801 corp: 5/111b lim: 50 exec/s: 0 rss: 67Mb L: 12/48 MS: 1 InsertByte- 00:08:26.383 [2024-12-13 07:03:44.522658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.383 [2024-12-13 07:03:44.522685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.383 #24 NEW cov: 11847 ft: 13934 corp: 6/122b lim: 50 exec/s: 0 rss: 67Mb L: 11/48 MS: 1 ShuffleBytes- 00:08:26.383 [2024-12-13 07:03:44.562708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.383 [2024-12-13 07:03:44.562735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.383 [2024-12-13 07:03:44.562856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.383 [2024-12-13 07:03:44.562878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.383 #25 NEW cov: 11847 ft: 14375 corp: 7/144b lim: 50 exec/s: 0 rss: 67Mb L: 22/48 MS: 1 CrossOver- 00:08:26.383 [2024-12-13 07:03:44.613681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.383 [2024-12-13 07:03:44.613711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.383 [2024-12-13 07:03:44.613813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.383 [2024-12-13 07:03:44.613842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.383 [2024-12-13 07:03:44.613969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.383 [2024-12-13 07:03:44.613994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.383 [2024-12-13 07:03:44.614131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.383 [2024-12-13 07:03:44.614151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.643 #26 NEW cov: 11847 ft: 14481 corp: 8/192b lim: 50 exec/s: 0 rss: 67Mb L: 48/48 MS: 1 ChangeBinInt- 00:08:26.643 [2024-12-13 07:03:44.653432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-12-13 07:03:44.653465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 [2024-12-13 07:03:44.653586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.643 [2024-12-13 07:03:44.653608] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.643 [2024-12-13 07:03:44.653723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.643 [2024-12-13 07:03:44.653746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.643 [2024-12-13 07:03:44.653871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.643 [2024-12-13 07:03:44.653893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.643 #27 NEW cov: 11847 ft: 14634 corp: 9/240b lim: 50 exec/s: 0 rss: 68Mb L: 48/48 MS: 1 ChangeByte- 00:08:26.643 [2024-12-13 07:03:44.703216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-12-13 07:03:44.703246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 #28 NEW cov: 11847 ft: 14680 corp: 10/252b lim: 50 exec/s: 0 rss: 68Mb L: 12/48 MS: 1 CMP- DE: "\000\002"- 00:08:26.643 [2024-12-13 07:03:44.753397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-12-13 07:03:44.753424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 #29 NEW cov: 11847 ft: 14709 corp: 11/264b lim: 50 exec/s: 0 rss: 68Mb L: 12/48 MS: 1 CrossOver- 00:08:26.643 [2024-12-13 07:03:44.804324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-12-13 07:03:44.804354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 [2024-12-13 07:03:44.804444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.643 [2024-12-13 07:03:44.804465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.643 [2024-12-13 07:03:44.804579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.643 [2024-12-13 07:03:44.804604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.643 [2024-12-13 07:03:44.804732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.643 [2024-12-13 07:03:44.804752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.643 #30 NEW cov: 11847 ft: 14729 corp: 12/304b lim: 50 exec/s: 0 rss: 68Mb L: 40/48 MS: 1 EraseBytes- 00:08:26.643 [2024-12-13 07:03:44.853746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-12-13 07:03:44.853775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 #31 NEW cov: 11847 ft: 14750 corp: 13/316b lim: 50 exec/s: 0 rss: 68Mb L: 12/48 
MS: 1 ChangeByte- 00:08:26.903 [2024-12-13 07:03:44.903948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.903 [2024-12-13 07:03:44.903975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.903 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:26.903 #32 NEW cov: 11870 ft: 14796 corp: 14/328b lim: 50 exec/s: 0 rss: 68Mb L: 12/48 MS: 1 CopyPart- 00:08:26.903 [2024-12-13 07:03:44.964388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.903 [2024-12-13 07:03:44.964421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.903 [2024-12-13 07:03:44.964549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.903 [2024-12-13 07:03:44.964570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.903 #33 NEW cov: 11870 ft: 14822 corp: 15/350b lim: 50 exec/s: 0 rss: 68Mb L: 22/48 MS: 1 ShuffleBytes- 00:08:26.903 [2024-12-13 07:03:45.014449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.903 [2024-12-13 07:03:45.014482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.903 [2024-12-13 07:03:45.014588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.903 [2024-12-13 07:03:45.014614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.903 #34 NEW cov: 11870 ft: 14850 corp: 16/373b lim: 50 exec/s: 34 rss: 68Mb L: 23/48 MS: 1 InsertByte- 00:08:26.903 [2024-12-13 07:03:45.064731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.903 [2024-12-13 07:03:45.064764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.903 [2024-12-13 07:03:45.064892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.903 [2024-12-13 07:03:45.064917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.903 #35 NEW cov: 11870 ft: 14861 corp: 17/395b lim: 50 exec/s: 35 rss: 68Mb L: 22/48 MS: 1 ChangeBit- 00:08:26.903 [2024-12-13 07:03:45.114685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.903 [2024-12-13 07:03:45.114721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.903 #36 NEW cov: 11870 ft: 14880 corp: 18/407b lim: 50 exec/s: 36 rss: 68Mb L: 12/48 MS: 1 ChangeBinInt- 00:08:27.162 [2024-12-13 07:03:45.164718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.162 [2024-12-13 07:03:45.164749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.162 #37 NEW cov: 11870 ft: 14895 corp: 19/418b lim: 50 exec/s: 37 rss: 68Mb L: 11/48 MS: 1 ChangeBit- 00:08:27.162 [2024-12-13 07:03:45.215171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.162 [2024-12-13 07:03:45.215208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.162 [2024-12-13 07:03:45.215341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.162 [2024-12-13 07:03:45.215364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.162 #38 NEW cov: 11870 ft: 14900 corp: 20/442b lim: 50 exec/s: 38 rss: 68Mb L: 24/48 MS: 1 CrossOver- 00:08:27.162 [2024-12-13 07:03:45.275378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.162 [2024-12-13 07:03:45.275410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.162 [2024-12-13 07:03:45.275523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.162 [2024-12-13 07:03:45.275544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.162 #39 NEW cov: 11870 ft: 14933 corp: 21/467b lim: 50 exec/s: 39 rss: 68Mb L: 25/48 MS: 1 InsertRepeatedBytes- 00:08:27.162 [2024-12-13 07:03:45.316089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.162 [2024-12-13 07:03:45.316120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.162 [2024-12-13 07:03:45.316247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.162 [2024-12-13 07:03:45.316272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.162 [2024-12-13 07:03:45.316399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.162 [2024-12-13 07:03:45.316419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.162 [2024-12-13 07:03:45.316547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.162 [2024-12-13 07:03:45.316573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.162 #40 NEW cov: 11870 ft: 14960 corp: 22/508b lim: 50 exec/s: 40 rss: 68Mb L: 41/48 MS: 1 InsertByte- 00:08:27.163 [2024-12-13 07:03:45.376222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.163 [2024-12-13 07:03:45.376255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.163 [2024-12-13 07:03:45.376369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.163 
[2024-12-13 07:03:45.376392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.163 [2024-12-13 07:03:45.376522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.163 [2024-12-13 07:03:45.376546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.163 [2024-12-13 07:03:45.376680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.163 [2024-12-13 07:03:45.376701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.421 #41 NEW cov: 11870 ft: 14966 corp: 23/556b lim: 50 exec/s: 41 rss: 69Mb L: 48/48 MS: 1 CrossOver- 00:08:27.421 [2024-12-13 07:03:45.436460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.421 [2024-12-13 07:03:45.436492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.421 [2024-12-13 07:03:45.436625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.421 [2024-12-13 07:03:45.436649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.421 [2024-12-13 07:03:45.436773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.421 [2024-12-13 07:03:45.436799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.421 [2024-12-13 07:03:45.436932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.421 [2024-12-13 07:03:45.436954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.421 #42 NEW cov: 11870 ft: 15034 corp: 24/596b lim: 50 exec/s: 42 rss: 69Mb L: 40/48 MS: 1 ChangeBit- 00:08:27.421 [2024-12-13 07:03:45.485758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.422 [2024-12-13 07:03:45.485792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.422 #43 NEW cov: 11870 ft: 15066 corp: 25/608b lim: 50 exec/s: 43 rss: 69Mb L: 12/48 MS: 1 ChangeBit- 00:08:27.422 [2024-12-13 07:03:45.536483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.422 [2024-12-13 07:03:45.536512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.422 [2024-12-13 07:03:45.536631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.422 [2024-12-13 07:03:45.536654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.422 [2024-12-13 07:03:45.536782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.422 [2024-12-13 
07:03:45.536804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.422 #44 NEW cov: 11870 ft: 15082 corp: 26/647b lim: 50 exec/s: 44 rss: 69Mb L: 39/48 MS: 1 CMP- DE: "\017\000"- 00:08:27.422 [2024-12-13 07:03:45.596125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.422 [2024-12-13 07:03:45.596152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.422 #45 NEW cov: 11870 ft: 15094 corp: 27/658b lim: 50 exec/s: 45 rss: 69Mb L: 11/48 MS: 1 EraseBytes- 00:08:27.422 [2024-12-13 07:03:45.646605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.422 [2024-12-13 07:03:45.646632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.422 [2024-12-13 07:03:45.646754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.422 [2024-12-13 07:03:45.646771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.680 #46 NEW cov: 11870 ft: 15113 corp: 28/682b lim: 50 exec/s: 46 rss: 69Mb L: 24/48 MS: 1 PersAutoDict- DE: "\000\002"- 00:08:27.680 [2024-12-13 07:03:45.696989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.680 [2024-12-13 07:03:45.697023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.680 [2024-12-13 07:03:45.697147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.680 [2024-12-13 07:03:45.697171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.680 [2024-12-13 07:03:45.697294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.680 [2024-12-13 07:03:45.697314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.680 #47 NEW cov: 11870 ft: 15122 corp: 29/715b lim: 50 exec/s: 47 rss: 69Mb L: 33/48 MS: 1 CrossOver- 00:08:27.680 [2024-12-13 07:03:45.746804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.680 [2024-12-13 07:03:45.746831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.680 [2024-12-13 07:03:45.746971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.680 [2024-12-13 07:03:45.746991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.680 #48 NEW cov: 11870 ft: 15138 corp: 30/738b lim: 50 exec/s: 48 rss: 69Mb L: 23/48 MS: 1 CrossOver- 00:08:27.680 [2024-12-13 07:03:45.786921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.680 [2024-12-13 07:03:45.786954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.680 [2024-12-13 07:03:45.787044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.680 [2024-12-13 07:03:45.787066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.680 [2024-12-13 07:03:45.787191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.680 [2024-12-13 07:03:45.787215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.680 [2024-12-13 07:03:45.787358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.680 [2024-12-13 07:03:45.787382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.680 #49 NEW cov: 11870 ft: 15179 corp: 31/781b lim: 50 exec/s: 49 rss: 69Mb L: 43/48 MS: 1 PersAutoDict- DE: "\017\000"- 00:08:27.680 [2024-12-13 07:03:45.837674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.680 [2024-12-13 07:03:45.837706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.680 [2024-12-13 07:03:45.837827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.680 [2024-12-13 07:03:45.837849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.680 [2024-12-13 07:03:45.837972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.680 [2024-12-13 07:03:45.837997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.680 [2024-12-13 07:03:45.838126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.680 [2024-12-13 07:03:45.838148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.680 #50 NEW cov: 11870 ft: 15240 corp: 32/824b lim: 50 exec/s: 50 rss: 69Mb L: 43/48 MS: 1 ChangeBit- 00:08:27.680 [2024-12-13 07:03:45.886961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.680 [2024-12-13 07:03:45.886987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.680 #51 NEW cov: 11870 ft: 15259 corp: 33/836b lim: 50 exec/s: 51 rss: 69Mb L: 12/48 MS: 1 ChangeBinInt- 00:08:27.939 [2024-12-13 07:03:45.937673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.939 [2024-12-13 07:03:45.937704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.939 [2024-12-13 07:03:45.937817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.939 [2024-12-13 07:03:45.937839] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.939 [2024-12-13 07:03:45.937972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.939 [2024-12-13 07:03:45.937993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.939 #52 NEW cov: 11870 ft: 15287 corp: 34/870b lim: 50 exec/s: 52 rss: 69Mb L: 34/48 MS: 1 CopyPart- 00:08:27.939 [2024-12-13 07:03:45.988132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.939 [2024-12-13 07:03:45.988162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.939 [2024-12-13 07:03:45.988302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.939 [2024-12-13 07:03:45.988328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.939 [2024-12-13 07:03:45.988457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.939 [2024-12-13 07:03:45.988478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.939 [2024-12-13 07:03:45.988608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.939 [2024-12-13 07:03:45.988632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.939 #53 NEW cov: 11870 ft: 15301 corp: 35/915b lim: 50 exec/s: 26 rss: 69Mb L: 45/48 MS: 1 CrossOver- 00:08:27.939 #53 DONE cov: 11870 ft: 15301 corp: 35/915b lim: 50 exec/s: 26 rss: 69Mb 00:08:27.939 ###### Recommended dictionary. ###### 00:08:27.939 "\000\002" # Uses: 1 00:08:27.939 "\017\000" # Uses: 1 00:08:27.939 ###### End of recommended dictionary. 
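The recommended-dictionary block above is libFuzzer reporting which byte sequences kept unlocking new coverage in this run: "\000\002" and "\017\000" are octal escapes for 0x00 0x02 and 0x0f 0x00, the same tokens replayed earlier as CMP and PersAutoDict mutations. Such entries can be carried into later runs through libFuzzer's standard -dict= option; a small sketch follows (the file name is made up, and whether llvm_nvme_fuzz forwards -dict= through to libFuzzer is an assumption this log does not confirm):

    # Persist the recommended entries in AFL/libFuzzer dictionary syntax;
    # the octal \000\002 and \017\000 from the log become hex escapes here.
    printf '%s\n' '# from nvmf fuzzer 21' '"\x00\x02"' '"\x0f\x00"' > llvm_nvmf_21.dict
    # A later run would then append -dict=llvm_nvmf_21.dict to the
    # llvm_nvme_fuzz invocation, assuming the flag reaches libFuzzer.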
######
00:08:27.939 Done 53 runs in 2 second(s)
00:08:27.939 07:03:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf
00:08:27.939 07:03:46 -- ../common.sh@72 -- # (( i++ ))
00:08:27.939 07:03:46 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:27.939 07:03:46 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1
00:08:27.939 07:03:46 -- nvmf/run.sh@23 -- # local fuzzer_type=22
00:08:27.939 07:03:46 -- nvmf/run.sh@24 -- # local timen=1
00:08:27.939 07:03:46 -- nvmf/run.sh@25 -- # local core=0x1
00:08:27.939 07:03:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:27.939 07:03:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf
00:08:27.939 07:03:46 -- nvmf/run.sh@29 -- # printf %02d 22
00:08:27.939 07:03:46 -- nvmf/run.sh@29 -- # port=4422
00:08:27.939 07:03:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:27.939 07:03:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422'
00:08:27.939 07:03:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:27.939 07:03:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock
00:08:28.198 [2024-12-13 07:03:46.171364] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:28.198 [2024-12-13 07:03:46.171454] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid499573 ]
00:08:28.198 EAL: No free 2048 kB hugepages reported on node 1
00:08:28.198 [2024-12-13 07:03:46.347370] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:28.198 [2024-12-13 07:03:46.367038] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:28.198 [2024-12-13 07:03:46.367172] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:28.198 [2024-12-13 07:03:46.418583] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:28.198 [2024-12-13 07:03:46.434887] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 ***
00:08:28.456 INFO: Running with entropic power schedule (0xFF, 100).
00:08:28.456 INFO: Seed: 3579078632
00:08:28.456 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:28.456 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:28.456 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22
00:08:28.456 INFO: A corpus is not provided, starting from an empty corpus
00:08:28.456 #2 INITED exec/s: 0 rss: 59Mb
00:08:28.456 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:28.456 This may also happen if the target rejected all inputs we tried so far 00:08:28.456 [2024-12-13 07:03:46.511035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.456 [2024-12-13 07:03:46.511068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.456 [2024-12-13 07:03:46.511184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.456 [2024-12-13 07:03:46.511205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.456 [2024-12-13 07:03:46.511323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.456 [2024-12-13 07:03:46.511350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.715 NEW_FUNC[1/672]: 0x478a98 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:28.715 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:28.715 #14 NEW cov: 11668 ft: 11670 corp: 2/55b lim: 85 exec/s: 0 rss: 67Mb L: 54/54 MS: 2 CMP-InsertRepeatedBytes- DE: "\004\000\000\000"- 00:08:28.715 [2024-12-13 07:03:46.831971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.715 [2024-12-13 07:03:46.832009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.715 [2024-12-13 07:03:46.832130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.715 [2024-12-13 07:03:46.832153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.715 [2024-12-13 07:03:46.832278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.715 [2024-12-13 07:03:46.832297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.715 #15 NEW cov: 11782 ft: 12141 corp: 3/109b lim: 85 exec/s: 0 rss: 67Mb L: 54/54 MS: 1 ChangeBit- 00:08:28.715 [2024-12-13 07:03:46.882087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.715 [2024-12-13 07:03:46.882120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.715 [2024-12-13 07:03:46.882234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.716 [2024-12-13 07:03:46.882253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.716 [2024-12-13 07:03:46.882370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.716 [2024-12-13 07:03:46.882391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:28.716 #25 NEW cov: 11788 ft: 12530 corp: 4/168b lim: 85 exec/s: 0 rss: 67Mb L: 59/59 MS: 5 ChangeByte-CrossOver-ChangeByte-CrossOver-InsertRepeatedBytes- 00:08:28.716 [2024-12-13 07:03:46.921628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.716 [2024-12-13 07:03:46.921659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.716 #26 NEW cov: 11873 ft: 13552 corp: 5/189b lim: 85 exec/s: 0 rss: 67Mb L: 21/59 MS: 1 CrossOver- 00:08:28.975 [2024-12-13 07:03:46.972315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.975 [2024-12-13 07:03:46.972348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:46.972462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.975 [2024-12-13 07:03:46.972482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:46.972594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.975 [2024-12-13 07:03:46.972615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.975 #27 NEW cov: 11873 ft: 13744 corp: 6/243b lim: 85 exec/s: 0 rss: 67Mb L: 54/59 MS: 1 ChangeByte- 00:08:28.975 [2024-12-13 07:03:47.012165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.975 [2024-12-13 07:03:47.012202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:47.012322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.975 [2024-12-13 07:03:47.012346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.975 #33 NEW cov: 11873 ft: 14064 corp: 7/283b lim: 85 exec/s: 0 rss: 67Mb L: 40/59 MS: 1 EraseBytes- 00:08:28.975 [2024-12-13 07:03:47.062721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.975 [2024-12-13 07:03:47.062754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:47.062871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.975 [2024-12-13 07:03:47.062893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:47.063007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.975 [2024-12-13 07:03:47.063028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.975 #34 NEW cov: 11873 ft: 14172 corp: 8/337b lim: 85 exec/s: 0 rss: 67Mb L: 54/59 MS: 1 ChangeBinInt- 00:08:28.975 [2024-12-13 
07:03:47.102896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.975 [2024-12-13 07:03:47.102929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:47.103031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.975 [2024-12-13 07:03:47.103053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:47.103176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.975 [2024-12-13 07:03:47.103202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:47.103329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.975 [2024-12-13 07:03:47.103352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.975 #35 NEW cov: 11873 ft: 14574 corp: 9/406b lim: 85 exec/s: 0 rss: 67Mb L: 69/69 MS: 1 InsertRepeatedBytes- 00:08:28.975 [2024-12-13 07:03:47.152384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.975 [2024-12-13 07:03:47.152410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.975 #36 NEW cov: 11873 ft: 14633 corp: 10/427b lim: 85 exec/s: 0 rss: 67Mb L: 21/69 MS: 1 ChangeBinInt- 00:08:28.975 [2024-12-13 07:03:47.202981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.975 [2024-12-13 07:03:47.203014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:47.203114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.975 [2024-12-13 07:03:47.203138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.975 [2024-12-13 07:03:47.203261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.975 [2024-12-13 07:03:47.203280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.235 #37 NEW cov: 11873 ft: 14669 corp: 11/486b lim: 85 exec/s: 0 rss: 67Mb L: 59/69 MS: 1 ShuffleBytes- 00:08:29.235 [2024-12-13 07:03:47.243109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.235 [2024-12-13 07:03:47.243138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.235 [2024-12-13 07:03:47.243263] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.235 [2024-12-13 07:03:47.243289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.235 
[2024-12-13 07:03:47.243414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.235 [2024-12-13 07:03:47.243436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.235 #38 NEW cov: 11873 ft: 14687 corp: 12/545b lim: 85 exec/s: 0 rss: 67Mb L: 59/69 MS: 1 ChangeBinInt- 00:08:29.235 [2024-12-13 07:03:47.283149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.235 [2024-12-13 07:03:47.283178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.235 [2024-12-13 07:03:47.283254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.235 [2024-12-13 07:03:47.283277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.235 [2024-12-13 07:03:47.283393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.235 [2024-12-13 07:03:47.283410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.235 #39 NEW cov: 11873 ft: 14703 corp: 13/604b lim: 85 exec/s: 0 rss: 68Mb L: 59/69 MS: 1 ShuffleBytes- 00:08:29.235 [2024-12-13 07:03:47.322878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.235 [2024-12-13 07:03:47.322906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.235 #40 NEW cov: 11873 ft: 14718 corp: 14/637b lim: 85 exec/s: 0 rss: 68Mb L: 33/69 MS: 1 EraseBytes- 00:08:29.235 [2024-12-13 07:03:47.362974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.235 [2024-12-13 07:03:47.363000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.235 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.235 #45 NEW cov: 11896 ft: 14788 corp: 15/661b lim: 85 exec/s: 0 rss: 68Mb L: 24/69 MS: 5 EraseBytes-ChangeByte-CrossOver-ChangeBit-CopyPart- 00:08:29.235 [2024-12-13 07:03:47.413193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.235 [2024-12-13 07:03:47.413236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.235 #46 NEW cov: 11896 ft: 14914 corp: 16/682b lim: 85 exec/s: 0 rss: 68Mb L: 21/69 MS: 1 ChangeByte- 00:08:29.235 [2024-12-13 07:03:47.453171] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.235 [2024-12-13 07:03:47.453208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.494 #47 NEW cov: 11896 ft: 14947 corp: 17/715b lim: 85 exec/s: 47 rss: 68Mb L: 33/69 MS: 1 ShuffleBytes- 00:08:29.494 [2024-12-13 07:03:47.503431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:08:29.494 [2024-12-13 07:03:47.503460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.494 #48 NEW cov: 11896 ft: 15026 corp: 18/748b lim: 85 exec/s: 48 rss: 68Mb L: 33/69 MS: 1 ChangeByte- 00:08:29.494 [2024-12-13 07:03:47.543508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.494 [2024-12-13 07:03:47.543535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.494 #49 NEW cov: 11896 ft: 15046 corp: 19/781b lim: 85 exec/s: 49 rss: 68Mb L: 33/69 MS: 1 ShuffleBytes- 00:08:29.494 [2024-12-13 07:03:47.583645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.494 [2024-12-13 07:03:47.583671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.494 #50 NEW cov: 11896 ft: 15138 corp: 20/802b lim: 85 exec/s: 50 rss: 68Mb L: 21/69 MS: 1 ChangeBit- 00:08:29.494 [2024-12-13 07:03:47.624039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.494 [2024-12-13 07:03:47.624070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.494 [2024-12-13 07:03:47.624199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.494 [2024-12-13 07:03:47.624219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.494 #51 NEW cov: 11896 ft: 15168 corp: 21/842b lim: 85 exec/s: 51 rss: 68Mb L: 40/69 MS: 1 ShuffleBytes- 00:08:29.494 [2024-12-13 07:03:47.673970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.494 [2024-12-13 07:03:47.674002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.494 #53 NEW cov: 11896 ft: 15184 corp: 22/875b lim: 85 exec/s: 53 rss: 68Mb L: 33/69 MS: 2 EraseBytes-CrossOver- 00:08:29.494 [2024-12-13 07:03:47.724433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.494 [2024-12-13 07:03:47.724463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.494 [2024-12-13 07:03:47.724596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.494 [2024-12-13 07:03:47.724621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.494 [2024-12-13 07:03:47.724735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.494 [2024-12-13 07:03:47.724762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.753 #54 NEW cov: 11896 ft: 15223 corp: 23/934b lim: 85 exec/s: 54 rss: 68Mb L: 59/69 MS: 1 ChangeBit- 00:08:29.753 [2024-12-13 07:03:47.763647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.753 [2024-12-13 07:03:47.763677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.753 #55 NEW cov: 11896 ft: 15243 corp: 24/955b lim: 85 exec/s: 55 rss: 68Mb L: 21/69 MS: 1 ChangeByte- 00:08:29.753 [2024-12-13 07:03:47.804734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.753 [2024-12-13 07:03:47.804764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.753 [2024-12-13 07:03:47.804876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.753 [2024-12-13 07:03:47.804897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.753 [2024-12-13 07:03:47.805014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.753 [2024-12-13 07:03:47.805037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.753 [2024-12-13 07:03:47.845013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.753 [2024-12-13 07:03:47.845046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.753 [2024-12-13 07:03:47.845170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.753 [2024-12-13 07:03:47.845193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.753 [2024-12-13 07:03:47.845310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.753 [2024-12-13 07:03:47.845333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.753 [2024-12-13 07:03:47.885046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.753 [2024-12-13 07:03:47.885077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.753 [2024-12-13 07:03:47.885180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.754 [2024-12-13 07:03:47.885220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.754 [2024-12-13 07:03:47.885347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.754 [2024-12-13 07:03:47.885365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.754 #58 NEW cov: 11896 ft: 15249 corp: 25/1009b lim: 85 exec/s: 58 rss: 68Mb L: 54/69 MS: 3 ChangeBit-PersAutoDict-ChangeBinInt- DE: "\004\000\000\000"- 00:08:29.754 [2024-12-13 07:03:47.924271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.754 [2024-12-13 
07:03:47.924297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.754 #59 NEW cov: 11896 ft: 15299 corp: 26/1042b lim: 85 exec/s: 59 rss: 68Mb L: 33/69 MS: 1 PersAutoDict- DE: "\004\000\000\000"- 00:08:29.754 [2024-12-13 07:03:47.965494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.754 [2024-12-13 07:03:47.965525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.754 [2024-12-13 07:03:47.965611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.754 [2024-12-13 07:03:47.965632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.754 [2024-12-13 07:03:47.965748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.754 [2024-12-13 07:03:47.965768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.754 [2024-12-13 07:03:47.965894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.754 [2024-12-13 07:03:47.965910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.012 #60 NEW cov: 11896 ft: 15306 corp: 27/1117b lim: 85 exec/s: 60 rss: 68Mb L: 75/75 MS: 1 InsertRepeatedBytes- 00:08:30.012 [2024-12-13 07:03:48.015365] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.012 [2024-12-13 07:03:48.015397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.012 [2024-12-13 07:03:48.015509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.012 [2024-12-13 07:03:48.015532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.012 [2024-12-13 07:03:48.015652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.012 [2024-12-13 07:03:48.015671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.012 #61 NEW cov: 11896 ft: 15336 corp: 28/1176b lim: 85 exec/s: 61 rss: 68Mb L: 59/75 MS: 1 ChangeBinInt- 00:08:30.012 [2024-12-13 07:03:48.054543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.012 [2024-12-13 07:03:48.054571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.012 #62 NEW cov: 11896 ft: 15340 corp: 29/1200b lim: 85 exec/s: 62 rss: 68Mb L: 24/75 MS: 1 ShuffleBytes- 00:08:30.012 [2024-12-13 07:03:48.095125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.012 [2024-12-13 07:03:48.095157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.012 
#63 NEW cov: 11896 ft: 15344 corp: 30/1221b lim: 85 exec/s: 63 rss: 68Mb L: 21/75 MS: 1 CopyPart- 00:08:30.012 [2024-12-13 07:03:48.135683] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.012 [2024-12-13 07:03:48.135713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.012 [2024-12-13 07:03:48.135823] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.012 [2024-12-13 07:03:48.135845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.012 [2024-12-13 07:03:48.135958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.012 [2024-12-13 07:03:48.135978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.012 #64 NEW cov: 11896 ft: 15353 corp: 31/1275b lim: 85 exec/s: 64 rss: 69Mb L: 54/75 MS: 1 ShuffleBytes- 00:08:30.012 [2024-12-13 07:03:48.175380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.012 [2024-12-13 07:03:48.175405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.012 #65 NEW cov: 11896 ft: 15403 corp: 32/1308b lim: 85 exec/s: 65 rss: 69Mb L: 33/75 MS: 1 ChangeBinInt- 00:08:30.012 [2024-12-13 07:03:48.216144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.012 [2024-12-13 07:03:48.216179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.012 [2024-12-13 07:03:48.216305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.012 [2024-12-13 07:03:48.216328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.012 [2024-12-13 07:03:48.216447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.012 [2024-12-13 07:03:48.216469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.012 #66 NEW cov: 11896 ft: 15421 corp: 33/1363b lim: 85 exec/s: 66 rss: 69Mb L: 55/75 MS: 1 InsertByte- 00:08:30.272 [2024-12-13 07:03:48.255906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.272 [2024-12-13 07:03:48.255939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.256072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.272 [2024-12-13 07:03:48.256091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.272 #67 NEW cov: 11896 ft: 15442 corp: 34/1399b lim: 85 exec/s: 67 rss: 69Mb L: 36/75 MS: 1 InsertRepeatedBytes- 00:08:30.272 [2024-12-13 07:03:48.296310] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.272 [2024-12-13 07:03:48.296343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.296452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.272 [2024-12-13 07:03:48.296473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.296597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.272 [2024-12-13 07:03:48.296619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.272 #68 NEW cov: 11896 ft: 15446 corp: 35/1454b lim: 85 exec/s: 68 rss: 69Mb L: 55/75 MS: 1 InsertByte- 00:08:30.272 [2024-12-13 07:03:48.336380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.272 [2024-12-13 07:03:48.336412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.336527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.272 [2024-12-13 07:03:48.336549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.336669] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.272 [2024-12-13 07:03:48.336691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.272 #69 NEW cov: 11896 ft: 15450 corp: 36/1509b lim: 85 exec/s: 69 rss: 69Mb L: 55/75 MS: 1 ChangeBit- 00:08:30.272 [2024-12-13 07:03:48.376687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.272 [2024-12-13 07:03:48.376718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.376807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.272 [2024-12-13 07:03:48.376833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.376947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.272 [2024-12-13 07:03:48.376970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.377095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:30.272 [2024-12-13 07:03:48.377119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.272 #70 NEW cov: 11896 ft: 15456 corp: 37/1579b lim: 85 exec/s: 70 rss: 69Mb L: 70/75 MS: 1 InsertByte- 00:08:30.272 [2024-12-13 07:03:48.416564] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.272 [2024-12-13 07:03:48.416597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.416688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.272 [2024-12-13 07:03:48.416713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.416827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.272 [2024-12-13 07:03:48.416847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.272 #71 NEW cov: 11896 ft: 15503 corp: 38/1638b lim: 85 exec/s: 71 rss: 69Mb L: 59/75 MS: 1 ChangeBinInt- 00:08:30.272 [2024-12-13 07:03:48.466705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.272 [2024-12-13 07:03:48.466735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.466855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.272 [2024-12-13 07:03:48.466875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.272 [2024-12-13 07:03:48.466995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.272 [2024-12-13 07:03:48.467020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.272 #72 NEW cov: 11896 ft: 15512 corp: 39/1693b lim: 85 exec/s: 36 rss: 69Mb L: 55/75 MS: 1 ChangeBinInt- 00:08:30.272 #72 DONE cov: 11896 ft: 15512 corp: 39/1693b lim: 85 exec/s: 36 rss: 69Mb 00:08:30.272 ###### Recommended dictionary. ###### 00:08:30.272 "\004\000\000\000" # Uses: 2 00:08:30.272 ###### End of recommended dictionary. 
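For reference, the "Recommended dictionary" block above is standard libFuzzer end-of-run output: each quoted value is a byte sequence, printed with octal escapes, that produced new coverage when spliced in by the CMP/PersAutoDict mutators, and "Uses" counts roughly how often the entry appeared in mutations that were kept. Such entries can be carried into later runs as a dictionary file in AFL/libFuzzer syntax; a minimal sketch follows, where the file name and token label are made up for illustration and the value is the same bytes re-escaped in hex (octal \004 = \x04). Stock libFuzzer targets accept -dict=FILE, but this log does not show whether SPDK's llvm_nvme_fuzz wrapper forwards that flag.

  # nvmf_22.dict -- hypothetical dictionary file; '#' starts a comment
  # same bytes as the "\004\000\000\000" entry reported above
  kw1="\x04\x00\x00\x00"

A standalone libFuzzer target would then be invoked as, e.g., ./fuzzer -dict=nvmf_22.dict corpus/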
###### 00:08:30.272 Done 72 runs in 2 second(s) 00:08:30.531 07:03:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:30.531 07:03:48 -- ../common.sh@72 -- # (( i++ )) 00:08:30.531 07:03:48 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.531 07:03:48 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:30.531 07:03:48 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:30.531 07:03:48 -- nvmf/run.sh@24 -- # local timen=1 00:08:30.531 07:03:48 -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.531 07:03:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.531 07:03:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:30.531 07:03:48 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:30.531 07:03:48 -- nvmf/run.sh@29 -- # port=4423 00:08:30.531 07:03:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.531 07:03:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:30.531 07:03:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.531 07:03:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:30.531 [2024-12-13 07:03:48.659492] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:30.531 [2024-12-13 07:03:48.659582] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid500077 ] 00:08:30.531 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.790 [2024-12-13 07:03:48.835985] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.790 [2024-12-13 07:03:48.855395] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:30.790 [2024-12-13 07:03:48.855531] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.790 [2024-12-13 07:03:48.906959] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:30.790 [2024-12-13 07:03:48.923260] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:30.790 INFO: Running with entropic power schedule (0xFF, 100). 00:08:30.790 INFO: Seed: 1773122413 00:08:30.790 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:30.790 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:30.790 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.790 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.790 #2 INITED exec/s: 0 rss: 60Mb 00:08:30.790 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:30.790 This may also happen if the target rejected all inputs we tried so far 00:08:30.790 [2024-12-13 07:03:48.968458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.790 [2024-12-13 07:03:48.968487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.790 [2024-12-13 07:03:48.968540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.790 [2024-12-13 07:03:48.968555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.049 NEW_FUNC[1/671]: 0x47bcd8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:31.049 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:31.049 #5 NEW cov: 11597 ft: 11599 corp: 2/13b lim: 25 exec/s: 0 rss: 67Mb L: 12/12 MS: 3 CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:31.049 [2024-12-13 07:03:49.279469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.049 [2024-12-13 07:03:49.279513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.049 [2024-12-13 07:03:49.279593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.049 [2024-12-13 07:03:49.279616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.049 [2024-12-13 07:03:49.279679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.049 [2024-12-13 07:03:49.279700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.049 [2024-12-13 07:03:49.279763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.049 [2024-12-13 07:03:49.279784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.308 #9 NEW cov: 11715 ft: 12673 corp: 3/35b lim: 25 exec/s: 0 rss: 67Mb L: 22/22 MS: 4 InsertByte-CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:08:31.308 [2024-12-13 07:03:49.329209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.308 [2024-12-13 07:03:49.329237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.308 [2024-12-13 07:03:49.329275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.308 [2024-12-13 07:03:49.329290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.308 #15 NEW cov: 11721 ft: 12917 corp: 4/47b lim: 25 exec/s: 0 rss: 67Mb L: 12/22 MS: 1 ChangeBinInt- 00:08:31.308 [2024-12-13 07:03:49.369428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.308 [2024-12-13 
07:03:49.369454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.308 [2024-12-13 07:03:49.369496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.308 [2024-12-13 07:03:49.369512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.308 [2024-12-13 07:03:49.369568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.308 [2024-12-13 07:03:49.369584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.308 #16 NEW cov: 11806 ft: 13449 corp: 5/63b lim: 25 exec/s: 0 rss: 67Mb L: 16/22 MS: 1 CopyPart- 00:08:31.308 [2024-12-13 07:03:49.409586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.308 [2024-12-13 07:03:49.409613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.308 [2024-12-13 07:03:49.409650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.308 [2024-12-13 07:03:49.409665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.308 [2024-12-13 07:03:49.409723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.308 [2024-12-13 07:03:49.409738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.308 #22 NEW cov: 11806 ft: 13610 corp: 6/79b lim: 25 exec/s: 0 rss: 67Mb L: 16/22 MS: 1 ChangeASCIIInt- 00:08:31.308 [2024-12-13 07:03:49.449707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.308 [2024-12-13 07:03:49.449734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.308 [2024-12-13 07:03:49.449772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.308 [2024-12-13 07:03:49.449787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.308 [2024-12-13 07:03:49.449846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.308 [2024-12-13 07:03:49.449861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.308 #28 NEW cov: 11806 ft: 13689 corp: 7/96b lim: 25 exec/s: 0 rss: 67Mb L: 17/22 MS: 1 CrossOver- 00:08:31.309 [2024-12-13 07:03:49.489710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.309 [2024-12-13 07:03:49.489738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.309 [2024-12-13 07:03:49.489776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.309 [2024-12-13 07:03:49.489792] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.309 #29 NEW cov: 11806 ft: 13783 corp: 8/108b lim: 25 exec/s: 0 rss: 67Mb L: 12/22 MS: 1 CrossOver- 00:08:31.309 [2024-12-13 07:03:49.529787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.309 [2024-12-13 07:03:49.529814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.309 [2024-12-13 07:03:49.529864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.309 [2024-12-13 07:03:49.529880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.568 #35 NEW cov: 11806 ft: 13808 corp: 9/120b lim: 25 exec/s: 0 rss: 68Mb L: 12/22 MS: 1 ShuffleBytes- 00:08:31.568 [2024-12-13 07:03:49.570145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.568 [2024-12-13 07:03:49.570175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.570217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.568 [2024-12-13 07:03:49.570231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.570285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.568 [2024-12-13 07:03:49.570299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.570353] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.568 [2024-12-13 07:03:49.570367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.568 #36 NEW cov: 11806 ft: 13880 corp: 10/143b lim: 25 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 InsertByte- 00:08:31.568 [2024-12-13 07:03:49.610245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.568 [2024-12-13 07:03:49.610271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.610322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.568 [2024-12-13 07:03:49.610337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.610387] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.568 [2024-12-13 07:03:49.610402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.610455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.568 [2024-12-13 07:03:49.610469] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.568 #37 NEW cov: 11806 ft: 13943 corp: 11/166b lim: 25 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 ChangeBinInt- 00:08:31.568 [2024-12-13 07:03:49.650424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.568 [2024-12-13 07:03:49.650450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.650497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.568 [2024-12-13 07:03:49.650512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.650565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.568 [2024-12-13 07:03:49.650580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.650634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.568 [2024-12-13 07:03:49.650648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.568 #38 NEW cov: 11806 ft: 13949 corp: 12/188b lim: 25 exec/s: 0 rss: 68Mb L: 22/23 MS: 1 EraseBytes- 00:08:31.568 [2024-12-13 07:03:49.690248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.568 [2024-12-13 07:03:49.690274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.690329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.568 [2024-12-13 07:03:49.690355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.568 #39 NEW cov: 11806 ft: 13973 corp: 13/201b lim: 25 exec/s: 0 rss: 68Mb L: 13/23 MS: 1 InsertByte- 00:08:31.568 [2024-12-13 07:03:49.730592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.568 [2024-12-13 07:03:49.730620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.730663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.568 [2024-12-13 07:03:49.730678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.730733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.568 [2024-12-13 07:03:49.730749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.568 [2024-12-13 07:03:49.730801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.568 [2024-12-13 07:03:49.730816] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.568 #40 NEW cov: 11806 ft: 14002 corp: 14/223b lim: 25 exec/s: 0 rss: 68Mb L: 22/23 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:31.568 [2024-12-13 07:03:49.770364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.568 [2024-12-13 07:03:49.770393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.568 #42 NEW cov: 11806 ft: 14382 corp: 15/229b lim: 25 exec/s: 0 rss: 68Mb L: 6/23 MS: 2 InsertByte-PersAutoDict- DE: "\000\000\000\000"- 00:08:31.828 [2024-12-13 07:03:49.810732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.828 [2024-12-13 07:03:49.810760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:49.810799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.828 [2024-12-13 07:03:49.810814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:49.810871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.828 [2024-12-13 07:03:49.810886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.828 #43 NEW cov: 11806 ft: 14386 corp: 16/245b lim: 25 exec/s: 0 rss: 68Mb L: 16/23 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:31.828 [2024-12-13 07:03:49.850934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.828 [2024-12-13 07:03:49.850961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:49.851008] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.828 [2024-12-13 07:03:49.851023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:49.851076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.828 [2024-12-13 07:03:49.851091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:49.851144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.828 [2024-12-13 07:03:49.851161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.828 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:31.828 #44 NEW cov: 11829 ft: 14438 corp: 17/267b lim: 25 exec/s: 0 rss: 68Mb L: 22/23 MS: 1 ChangeByte- 00:08:31.828 [2024-12-13 07:03:49.890705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.828 [2024-12-13 07:03:49.890731] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.828 #45 NEW cov: 11829 ft: 14451 corp: 18/273b lim: 25 exec/s: 0 rss: 68Mb L: 6/23 MS: 1 ShuffleBytes- 00:08:31.828 [2024-12-13 07:03:49.931065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.828 [2024-12-13 07:03:49.931093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:49.931129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.828 [2024-12-13 07:03:49.931145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:49.931204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.828 [2024-12-13 07:03:49.931220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.828 #46 NEW cov: 11829 ft: 14457 corp: 19/289b lim: 25 exec/s: 0 rss: 68Mb L: 16/23 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:31.828 [2024-12-13 07:03:49.971144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.828 [2024-12-13 07:03:49.971171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:49.971241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.828 [2024-12-13 07:03:49.971257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:49.971309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.828 [2024-12-13 07:03:49.971324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.828 #47 NEW cov: 11829 ft: 14476 corp: 20/306b lim: 25 exec/s: 47 rss: 68Mb L: 17/23 MS: 1 ShuffleBytes- 00:08:31.828 [2024-12-13 07:03:50.011099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.828 [2024-12-13 07:03:50.011128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.828 #48 NEW cov: 11829 ft: 14491 corp: 21/312b lim: 25 exec/s: 48 rss: 68Mb L: 6/23 MS: 1 CMP- DE: "\366\377\377\377"- 00:08:31.828 [2024-12-13 07:03:50.051557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.828 [2024-12-13 07:03:50.051584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:50.051633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.828 [2024-12-13 07:03:50.051649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:50.051704] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.828 [2024-12-13 07:03:50.051720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.828 [2024-12-13 07:03:50.051779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.828 [2024-12-13 07:03:50.051794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.088 #49 NEW cov: 11829 ft: 14506 corp: 22/334b lim: 25 exec/s: 49 rss: 68Mb L: 22/23 MS: 1 ChangeBit- 00:08:32.088 [2024-12-13 07:03:50.091549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.088 [2024-12-13 07:03:50.091578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.091617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.088 [2024-12-13 07:03:50.091633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.091690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.088 [2024-12-13 07:03:50.091705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.088 #50 NEW cov: 11829 ft: 14549 corp: 23/350b lim: 25 exec/s: 50 rss: 68Mb L: 16/23 MS: 1 ChangeBinInt- 00:08:32.088 [2024-12-13 07:03:50.131467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.088 [2024-12-13 07:03:50.131495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.088 #51 NEW cov: 11829 ft: 14564 corp: 24/356b lim: 25 exec/s: 51 rss: 69Mb L: 6/23 MS: 1 CMP- DE: "\001\014"- 00:08:32.088 [2024-12-13 07:03:50.171885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.088 [2024-12-13 07:03:50.171912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.171955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.088 [2024-12-13 07:03:50.171971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.172024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.088 [2024-12-13 07:03:50.172040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.172094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.088 [2024-12-13 07:03:50.172110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.088 #52 NEW cov: 11829 ft: 14592 corp: 25/380b lim: 25 
exec/s: 52 rss: 69Mb L: 24/24 MS: 1 InsertByte- 00:08:32.088 [2024-12-13 07:03:50.211995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.088 [2024-12-13 07:03:50.212021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.212064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.088 [2024-12-13 07:03:50.212078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.212134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.088 [2024-12-13 07:03:50.212148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.212202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.088 [2024-12-13 07:03:50.212235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.088 #53 NEW cov: 11829 ft: 14640 corp: 26/404b lim: 25 exec/s: 53 rss: 69Mb L: 24/24 MS: 1 CMP- DE: "\257qn\013\271\347\002\000"- 00:08:32.088 [2024-12-13 07:03:50.262154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.088 [2024-12-13 07:03:50.262181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.262244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.088 [2024-12-13 07:03:50.262257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.262312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.088 [2024-12-13 07:03:50.262327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.262379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.088 [2024-12-13 07:03:50.262404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.088 #54 NEW cov: 11829 ft: 14658 corp: 27/427b lim: 25 exec/s: 54 rss: 69Mb L: 23/24 MS: 1 InsertRepeatedBytes- 00:08:32.088 [2024-12-13 07:03:50.302240] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.088 [2024-12-13 07:03:50.302267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.302319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.088 [2024-12-13 07:03:50.302335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.302391] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.088 [2024-12-13 07:03:50.302406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.088 [2024-12-13 07:03:50.302462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.088 [2024-12-13 07:03:50.302478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.348 #55 NEW cov: 11829 ft: 14696 corp: 28/451b lim: 25 exec/s: 55 rss: 69Mb L: 24/24 MS: 1 PersAutoDict- DE: "\366\377\377\377"- 00:08:32.348 [2024-12-13 07:03:50.342152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.348 [2024-12-13 07:03:50.342180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.342241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.348 [2024-12-13 07:03:50.342257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.348 #56 NEW cov: 11829 ft: 14736 corp: 29/463b lim: 25 exec/s: 56 rss: 69Mb L: 12/24 MS: 1 ChangeASCIIInt- 00:08:32.348 [2024-12-13 07:03:50.382504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.348 [2024-12-13 07:03:50.382531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.382594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.348 [2024-12-13 07:03:50.382609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.382666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.348 [2024-12-13 07:03:50.382681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.382736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.348 [2024-12-13 07:03:50.382752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.348 #62 NEW cov: 11829 ft: 14755 corp: 30/486b lim: 25 exec/s: 62 rss: 69Mb L: 23/24 MS: 1 ChangeBit- 00:08:32.348 [2024-12-13 07:03:50.422231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.348 [2024-12-13 07:03:50.422257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.348 #63 NEW cov: 11829 ft: 14757 corp: 31/491b lim: 25 exec/s: 63 rss: 69Mb L: 5/24 MS: 1 EraseBytes- 00:08:32.348 [2024-12-13 07:03:50.462723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.348 [2024-12-13 07:03:50.462750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.462801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.348 [2024-12-13 07:03:50.462816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.462871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.348 [2024-12-13 07:03:50.462887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.462941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.348 [2024-12-13 07:03:50.462957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.348 #64 NEW cov: 11829 ft: 14758 corp: 32/515b lim: 25 exec/s: 64 rss: 69Mb L: 24/24 MS: 1 ChangeByte- 00:08:32.348 [2024-12-13 07:03:50.502849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.348 [2024-12-13 07:03:50.502876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.502924] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.348 [2024-12-13 07:03:50.502940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.502994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.348 [2024-12-13 07:03:50.503007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.503061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.348 [2024-12-13 07:03:50.503076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.348 #65 NEW cov: 11829 ft: 14770 corp: 33/537b lim: 25 exec/s: 65 rss: 69Mb L: 22/24 MS: 1 ChangeBinInt- 00:08:32.348 [2024-12-13 07:03:50.543019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.348 [2024-12-13 07:03:50.543046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.543094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.348 [2024-12-13 07:03:50.543112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.543173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.348 [2024-12-13 07:03:50.543192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.348 [2024-12-13 07:03:50.543248] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.348 [2024-12-13 07:03:50.543263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.348 #66 NEW cov: 11829 ft: 14786 corp: 34/561b lim: 25 exec/s: 66 rss: 69Mb L: 24/24 MS: 1 ShuffleBytes- 00:08:32.349 [2024-12-13 07:03:50.582970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.349 [2024-12-13 07:03:50.582996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.349 [2024-12-13 07:03:50.583035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.349 [2024-12-13 07:03:50.583050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.349 [2024-12-13 07:03:50.583107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.349 [2024-12-13 07:03:50.583122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.608 #67 NEW cov: 11829 ft: 14863 corp: 35/577b lim: 25 exec/s: 67 rss: 69Mb L: 16/24 MS: 1 ChangeBinInt- 00:08:32.608 [2024-12-13 07:03:50.622952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.608 [2024-12-13 07:03:50.622978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.623032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.608 [2024-12-13 07:03:50.623046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.608 #68 NEW cov: 11829 ft: 14914 corp: 36/590b lim: 25 exec/s: 68 rss: 69Mb L: 13/24 MS: 1 ChangeASCIIInt- 00:08:32.608 [2024-12-13 07:03:50.663163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.608 [2024-12-13 07:03:50.663192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.663251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.608 [2024-12-13 07:03:50.663267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.663323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.608 [2024-12-13 07:03:50.663338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.608 #69 NEW cov: 11829 ft: 14924 corp: 37/606b lim: 25 exec/s: 69 rss: 69Mb L: 16/24 MS: 1 InsertRepeatedBytes- 00:08:32.608 [2024-12-13 07:03:50.703511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.608 [2024-12-13 07:03:50.703538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.703594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.608 [2024-12-13 07:03:50.703610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.703679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.608 [2024-12-13 07:03:50.703694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.703747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.608 [2024-12-13 07:03:50.703762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.703817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:32.608 [2024-12-13 07:03:50.703831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:32.608 #70 NEW cov: 11829 ft: 14990 corp: 38/631b lim: 25 exec/s: 70 rss: 69Mb L: 25/25 MS: 1 CrossOver- 00:08:32.608 [2024-12-13 07:03:50.743390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.608 [2024-12-13 07:03:50.743416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.743476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.608 [2024-12-13 07:03:50.743491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.743546] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.608 [2024-12-13 07:03:50.743562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.608 #71 NEW cov: 11829 ft: 15008 corp: 39/649b lim: 25 exec/s: 71 rss: 69Mb L: 18/25 MS: 1 CrossOver- 00:08:32.608 [2024-12-13 07:03:50.783405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.608 [2024-12-13 07:03:50.783432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.783470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.608 [2024-12-13 07:03:50.783485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.608 #75 NEW cov: 11829 ft: 15017 corp: 40/661b lim: 25 exec/s: 75 rss: 69Mb L: 12/25 MS: 4 ChangeBit-PersAutoDict-CopyPart-CMP- DE: "\001\014"-"\367\025\273`\271\347\002\000"- 00:08:32.608 [2024-12-13 07:03:50.813494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.608 [2024-12-13 07:03:50.813519] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.608 [2024-12-13 07:03:50.813557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.608 [2024-12-13 07:03:50.813572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.608 #76 NEW cov: 11829 ft: 15074 corp: 41/673b lim: 25 exec/s: 76 rss: 69Mb L: 12/25 MS: 1 ShuffleBytes- 00:08:32.868 [2024-12-13 07:03:50.853542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.868 [2024-12-13 07:03:50.853569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.868 #77 NEW cov: 11829 ft: 15097 corp: 42/678b lim: 25 exec/s: 77 rss: 69Mb L: 5/25 MS: 1 ChangeBit- 00:08:32.868 [2024-12-13 07:03:50.893632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.868 [2024-12-13 07:03:50.893658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.868 #78 NEW cov: 11829 ft: 15107 corp: 43/684b lim: 25 exec/s: 78 rss: 69Mb L: 6/25 MS: 1 ShuffleBytes- 00:08:32.868 [2024-12-13 07:03:50.934093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.868 [2024-12-13 07:03:50.934119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.868 [2024-12-13 07:03:50.934167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.868 [2024-12-13 07:03:50.934182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.868 [2024-12-13 07:03:50.934241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.868 [2024-12-13 07:03:50.934256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.868 [2024-12-13 07:03:50.934309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.868 [2024-12-13 07:03:50.934324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.868 #79 NEW cov: 11829 ft: 15126 corp: 44/706b lim: 25 exec/s: 39 rss: 70Mb L: 22/25 MS: 1 ChangeBinInt- 00:08:32.868 #79 DONE cov: 11829 ft: 15126 corp: 44/706b lim: 25 exec/s: 39 rss: 70Mb 00:08:32.868 ###### Recommended dictionary. ###### 00:08:32.868 "\000\000\000\000" # Uses: 3 00:08:32.868 "\366\377\377\377" # Uses: 1 00:08:32.868 "\001\014" # Uses: 1 00:08:32.868 "\257qn\013\271\347\002\000" # Uses: 0 00:08:32.868 "\367\025\273`\271\347\002\000" # Uses: 0 00:08:32.868 ###### End of recommended dictionary. 
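The NEW_FUNC lines in this log name the harness functions that coverage instrumentation observed for the first time: TestOneInput (llvm_nvme_fuzz.c:780) receives each generated input and hands it to a per-opcode handler such as fuzz_nvm_reservation_report_command (llvm_nvme_fuzz.c:671) or, for the run below, fuzz_nvm_compare_command (llvm_nvme_fuzz.c:685), whose submissions produce the nvme_qpair.c command/completion notices seen throughout. A minimal C sketch of that shape, assuming the standard libFuzzer entry-point signature; the handler body and all names not cited from the log are illustrative, not SPDK's actual code:

  #include <stddef.h>
  #include <stdint.h>
  #include <string.h>

  /* Illustrative per-opcode handler; the real harness selects one based on
   * the -Z <fuzzer_type> argument visible on the command lines above. */
  static void fuzz_compare_sketch(const uint8_t *data, size_t size)
  {
      uint8_t cmd[64] = {0};               /* stand-in for an NVMe command */
      memcpy(cmd, data, size < sizeof(cmd) ? size : sizeof(cmd));
      /* ...build and submit the command to the target; the qpair code then
       * prints NOTICE lines like those in this log... */
  }

  /* In a standalone target this is the function libFuzzer calls once per
   * generated input; a harness with its own main(), like this one, would
   * typically register its TestOneInput through LLVMFuzzerRunDriver. */
  int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
  {
      fuzz_compare_sketch(data, size);
      return 0;
  }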
###### 00:08:32.868 Done 79 runs in 2 second(s) 00:08:32.868 07:03:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:32.868 07:03:51 -- ../common.sh@72 -- # (( i++ )) 00:08:32.868 07:03:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.868 07:03:51 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:32.868 07:03:51 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:32.868 07:03:51 -- nvmf/run.sh@24 -- # local timen=1 00:08:32.868 07:03:51 -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.868 07:03:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:32.868 07:03:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:32.868 07:03:51 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:32.868 07:03:51 -- nvmf/run.sh@29 -- # port=4424 00:08:32.868 07:03:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:32.868 07:03:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:32.868 07:03:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.868 07:03:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:33.127 [2024-12-13 07:03:51.118896] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:33.127 [2024-12-13 07:03:51.118985] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid500406 ] 00:08:33.127 EAL: No free 2048 kB hugepages reported on node 1 00:08:33.127 [2024-12-13 07:03:51.298285] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.127 [2024-12-13 07:03:51.318078] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:33.127 [2024-12-13 07:03:51.318226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.386 [2024-12-13 07:03:51.369662] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.386 [2024-12-13 07:03:51.385984] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:33.386 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.386 INFO: Seed: 4234118587 00:08:33.386 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:33.386 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:33.386 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:33.386 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.386 #2 INITED exec/s: 0 rss: 59Mb 00:08:33.386 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:33.386 This may also happen if the target rejected all inputs we tried so far 00:08:33.386 [2024-12-13 07:03:51.451765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.386 [2024-12-13 07:03:51.451800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.646 NEW_FUNC[1/671]: 0x47cdc8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:33.646 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:33.646 #13 NEW cov: 11650 ft: 11664 corp: 2/29b lim: 100 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:08:33.646 [2024-12-13 07:03:51.782653] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.646 [2024-12-13 07:03:51.782696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.646 NEW_FUNC[1/1]: 0xed3e28 in spdk_ring_dequeue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:415 00:08:33.646 #19 NEW cov: 11787 ft: 12212 corp: 3/57b lim: 100 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 CMP- DE: "\377\377\377\017"- 00:08:33.646 [2024-12-13 07:03:51.832623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.646 [2024-12-13 07:03:51.832650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.646 #50 NEW cov: 11793 ft: 12672 corp: 4/85b lim: 100 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 PersAutoDict- DE: "\377\377\377\017"- 00:08:33.646 [2024-12-13 07:03:51.872807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.646 [2024-12-13 07:03:51.872836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.905 #56 NEW cov: 11878 ft: 12893 corp: 5/117b lim: 100 exec/s: 0 rss: 67Mb L: 32/32 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:33.905 [2024-12-13 07:03:51.913708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.905 [2024-12-13 07:03:51.913737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.905 [2024-12-13 07:03:51.913834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.905 [2024-12-13 07:03:51.913855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.905 [2024-12-13 07:03:51.913965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.905 [2024-12-13 07:03:51.913985] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.905 [2024-12-13 07:03:51.914101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.905 [2024-12-13 07:03:51.914122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.906 #58 NEW cov: 11878 ft: 13881 corp: 6/203b lim: 100 exec/s: 0 rss: 67Mb L: 86/86 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:33.906 [2024-12-13 07:03:51.953830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:51.953857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:51.953924] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:51.953942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:51.954052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:51.954072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:51.954197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:51.954217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.906 #59 NEW cov: 11878 ft: 13958 corp: 7/289b lim: 100 exec/s: 0 rss: 67Mb L: 86/86 MS: 1 ChangeByte- 00:08:33.906 [2024-12-13 07:03:52.003165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.003201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.906 #60 NEW cov: 11878 ft: 14018 corp: 8/317b lim: 100 exec/s: 0 rss: 67Mb L: 28/86 MS: 1 PersAutoDict- DE: "\377\377\377\017"- 00:08:33.906 [2024-12-13 07:03:52.043843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.043874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:52.043983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.044011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:52.044127] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.044149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.906 #61 NEW cov: 11878 ft: 14374 corp: 9/377b lim: 100 exec/s: 0 rss: 67Mb L: 60/86 MS: 1 InsertRepeatedBytes- 00:08:33.906 [2024-12-13 07:03:52.094230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.094263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:52.094349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.094372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:52.094489] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.094508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:52.094615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.094637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.906 #62 NEW cov: 11878 ft: 14452 corp: 10/463b lim: 100 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 ChangeBit- 00:08:33.906 [2024-12-13 07:03:52.144551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446606093773373439 len:33411 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.144582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:52.144660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9404222468949967490 len:33411 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.144680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:52.144797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9404222468949967490 len:33411 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.144817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.906 [2024-12-13 07:03:52.144936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9404222468949967490 len:33411 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.906 [2024-12-13 07:03:52.144954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:34.165 #65 NEW cov: 11878 ft: 14490 corp: 11/562b lim: 100 exec/s: 0 rss: 68Mb L: 99/99 MS: 3 CrossOver-InsertByte-InsertRepeatedBytes- 00:08:34.165 [2024-12-13 07:03:52.184268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.165 [2024-12-13 07:03:52.184302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.165 [2024-12-13 07:03:52.184399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18408182001900191743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.165 [2024-12-13 07:03:52.184422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.165 [2024-12-13 07:03:52.184542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.165 [2024-12-13 07:03:52.184562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.165 #66 NEW cov: 11878 ft: 14545 corp: 12/622b lim: 100 exec/s: 0 rss: 68Mb L: 60/99 MS: 1 ChangeByte- 00:08:34.165 [2024-12-13 07:03:52.233871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.165 [2024-12-13 07:03:52.233900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.165 #67 NEW cov: 11878 ft: 14588 corp: 13/654b lim: 100 exec/s: 0 rss: 68Mb L: 32/99 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:34.165 [2024-12-13 07:03:52.274586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.165 [2024-12-13 07:03:52.274617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.166 [2024-12-13 07:03:52.274689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18408182001900191743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.274712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.166 [2024-12-13 07:03:52.274828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446503280663068671 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.274853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.166 #68 NEW cov: 11878 ft: 14609 corp: 14/714b lim: 100 exec/s: 0 rss: 68Mb L: 60/99 MS: 1 ChangeByte- 00:08:34.166 [2024-12-13 07:03:52.324994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.325024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.166 [2024-12-13 07:03:52.325088] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.325106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.166 [2024-12-13 07:03:52.325227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.325248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.166 [2024-12-13 07:03:52.325375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073700114431 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.325397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.166 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:34.166 #69 NEW cov: 11901 ft: 14663 corp: 15/801b lim: 100 exec/s: 0 rss: 68Mb L: 87/99 MS: 1 InsertByte- 00:08:34.166 [2024-12-13 07:03:52.364881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.364910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.166 [2024-12-13 07:03:52.365021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446742978492891135 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.365041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.166 [2024-12-13 07:03:52.365147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.365169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.166 #70 NEW cov: 11901 ft: 14731 corp: 16/861b lim: 100 exec/s: 0 rss: 68Mb L: 60/99 MS: 1 CopyPart- 00:08:34.166 [2024-12-13 07:03:52.404718] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.404752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.166 [2024-12-13 07:03:52.404864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.166 [2024-12-13 07:03:52.404886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.425 #71 NEW cov: 11901 ft: 15054 corp: 17/905b lim: 100 exec/s: 71 rss: 68Mb L: 44/99 MS: 1 CopyPart- 00:08:34.425 [2024-12-13 07:03:52.455058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:34.425 [2024-12-13 07:03:52.455088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.425 [2024-12-13 07:03:52.455176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446742978476129535 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.425 [2024-12-13 07:03:52.455201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.425 [2024-12-13 07:03:52.455320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.425 [2024-12-13 07:03:52.455342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.425 #72 NEW cov: 11901 ft: 15075 corp: 18/965b lim: 100 exec/s: 72 rss: 68Mb L: 60/99 MS: 1 ChangeBinInt- 00:08:34.425 [2024-12-13 07:03:52.504751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.425 [2024-12-13 07:03:52.504774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.425 #73 NEW cov: 11901 ft: 15096 corp: 19/993b lim: 100 exec/s: 73 rss: 68Mb L: 28/99 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:34.425 [2024-12-13 07:03:52.545719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.425 [2024-12-13 07:03:52.545751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.425 [2024-12-13 07:03:52.545864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.425 [2024-12-13 07:03:52.545885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.425 [2024-12-13 07:03:52.545996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.425 [2024-12-13 07:03:52.546016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.425 [2024-12-13 07:03:52.546132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.425 [2024-12-13 07:03:52.546149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.425 #75 NEW cov: 11901 ft: 15136 corp: 20/1089b lim: 100 exec/s: 75 rss: 68Mb L: 96/99 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:34.425 [2024-12-13 07:03:52.584959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732802 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.425 [2024-12-13 07:03:52.584991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:34.425 #76 NEW cov: 11901 ft: 15171 corp: 21/1117b lim: 100 exec/s: 76 rss: 68Mb L: 28/99 MS: 1 ChangeBit- 00:08:34.426 [2024-12-13 07:03:52.625630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.426 [2024-12-13 07:03:52.625662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.426 [2024-12-13 07:03:52.625751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.426 [2024-12-13 07:03:52.625773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.426 [2024-12-13 07:03:52.625887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4278190080 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.426 [2024-12-13 07:03:52.625906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.426 #77 NEW cov: 11901 ft: 15193 corp: 22/1177b lim: 100 exec/s: 77 rss: 68Mb L: 60/99 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\004"- 00:08:34.685 [2024-12-13 07:03:52.665960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.665993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.685 [2024-12-13 07:03:52.666079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.666098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.685 [2024-12-13 07:03:52.666217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.666242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.685 [2024-12-13 07:03:52.666365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18230571291595767807 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.666385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.685 #78 NEW cov: 11901 ft: 15262 corp: 23/1263b lim: 100 exec/s: 78 rss: 68Mb L: 86/99 MS: 1 ChangeBinInt- 00:08:34.685 [2024-12-13 07:03:52.715608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.715637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.685 [2024-12-13 07:03:52.715735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:34.685 [2024-12-13 07:03:52.715758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.685 #79 NEW cov: 11901 ft: 15276 corp: 24/1307b lim: 100 exec/s: 79 rss: 68Mb L: 44/99 MS: 1 CopyPart- 00:08:34.685 [2024-12-13 07:03:52.756192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.756221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.685 [2024-12-13 07:03:52.756304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.756329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.685 [2024-12-13 07:03:52.756443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.756466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.685 [2024-12-13 07:03:52.756578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:14974415777481871311 len:53248 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.756599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.685 #80 NEW cov: 11901 ft: 15282 corp: 25/1405b lim: 100 exec/s: 80 rss: 68Mb L: 98/99 MS: 1 InsertRepeatedBytes- 00:08:34.685 [2024-12-13 07:03:52.806441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.685 [2024-12-13 07:03:52.806471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.685 [2024-12-13 07:03:52.806568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446584644523524095 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.686 [2024-12-13 07:03:52.806588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.686 [2024-12-13 07:03:52.806712] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.686 [2024-12-13 07:03:52.806739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.686 [2024-12-13 07:03:52.806861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.686 [2024-12-13 07:03:52.806884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.686 #81 NEW cov: 11901 ft: 15293 corp: 26/1485b lim: 100 exec/s: 81 rss: 69Mb L: 80/99 MS: 1 EraseBytes- 
00:08:34.686 [2024-12-13 07:03:52.845816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732802 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.686 [2024-12-13 07:03:52.845842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.686 #82 NEW cov: 11901 ft: 15333 corp: 27/1513b lim: 100 exec/s: 82 rss: 69Mb L: 28/99 MS: 1 ChangeByte- 00:08:34.686 [2024-12-13 07:03:52.896387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.686 [2024-12-13 07:03:52.896421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.686 [2024-12-13 07:03:52.896496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.686 [2024-12-13 07:03:52.896522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.686 [2024-12-13 07:03:52.896640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4278190080 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.686 [2024-12-13 07:03:52.896662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.945 #83 NEW cov: 11901 ft: 15353 corp: 28/1573b lim: 100 exec/s: 83 rss: 69Mb L: 60/99 MS: 1 ShuffleBytes- 00:08:34.945 [2024-12-13 07:03:52.946587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:256 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:52.946620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:52.946729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:52.946753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:52.946883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:52.946902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.945 #84 NEW cov: 11901 ft: 15361 corp: 29/1633b lim: 100 exec/s: 84 rss: 69Mb L: 60/99 MS: 1 CMP- DE: "\377\377}\376\\\016OU"- 00:08:34.945 [2024-12-13 07:03:52.987034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:52.987064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:52.987150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 
[2024-12-13 07:03:52.987171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:52.987289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:52.987313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:52.987427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:52.987447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.945 #85 NEW cov: 11901 ft: 15378 corp: 30/1729b lim: 100 exec/s: 85 rss: 69Mb L: 96/99 MS: 1 CMP- DE: "\377\377"- 00:08:34.945 [2024-12-13 07:03:53.036902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.036936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:53.037048] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18408182001900191743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.037068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:53.037178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446503280663068671 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.037209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.945 #86 NEW cov: 11901 ft: 15383 corp: 31/1790b lim: 100 exec/s: 86 rss: 69Mb L: 61/99 MS: 1 InsertByte- 00:08:34.945 [2024-12-13 07:03:53.087330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.087363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:53.087493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.087516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:53.087626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.087646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:53.087777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 
len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.087799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.945 #87 NEW cov: 11901 ft: 15444 corp: 32/1886b lim: 100 exec/s: 87 rss: 69Mb L: 96/99 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:34.945 [2024-12-13 07:03:53.127440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.127473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:53.127557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744069448139007 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.127577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.945 [2024-12-13 07:03:53.127687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.945 [2024-12-13 07:03:53.127711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.946 [2024-12-13 07:03:53.127830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.946 [2024-12-13 07:03:53.127852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.946 #93 NEW cov: 11901 ft: 15461 corp: 33/1983b lim: 100 exec/s: 93 rss: 69Mb L: 97/99 MS: 1 InsertRepeatedBytes- 00:08:34.946 [2024-12-13 07:03:53.177338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.946 [2024-12-13 07:03:53.177369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.946 [2024-12-13 07:03:53.177462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18374966856160116735 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.946 [2024-12-13 07:03:53.177485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.946 [2024-12-13 07:03:53.177605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.946 [2024-12-13 07:03:53.177626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.205 #94 NEW cov: 11901 ft: 15492 corp: 34/2043b lim: 100 exec/s: 94 rss: 69Mb L: 60/99 MS: 1 ShuffleBytes- 00:08:35.205 [2024-12-13 07:03:53.226971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.227002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.205 #100 NEW cov: 11901 ft: 15496 corp: 35/2070b lim: 100 exec/s: 100 rss: 69Mb L: 27/99 MS: 1 EraseBytes- 00:08:35.205 [2024-12-13 07:03:53.267010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.267042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.205 #101 NEW cov: 11901 ft: 15514 corp: 36/2098b lim: 100 exec/s: 101 rss: 69Mb L: 28/99 MS: 1 ChangeBit- 00:08:35.205 [2024-12-13 07:03:53.307739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.307770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.205 [2024-12-13 07:03:53.307885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18408182001900191743 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.307906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.205 [2024-12-13 07:03:53.308027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446503280663068671 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.308049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.205 #102 NEW cov: 11901 ft: 15516 corp: 37/2159b lim: 100 exec/s: 102 rss: 69Mb L: 61/99 MS: 1 ChangeBit- 00:08:35.205 [2024-12-13 07:03:53.358123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446743055802302463 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.358154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.205 [2024-12-13 07:03:53.358237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.358255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.205 [2024-12-13 07:03:53.358369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.358388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.205 [2024-12-13 07:03:53.358506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18230571291595767807 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.358523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.205 #103 NEW cov: 11901 ft: 15523 corp: 38/2245b lim: 100 exec/s: 103 rss: 69Mb L: 86/99 MS: 1 ChangeByte- 00:08:35.205 [2024-12-13 07:03:53.408211] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:72057589910732800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.408244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.205 [2024-12-13 07:03:53.408311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446528573725474815 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.408329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.205 [2024-12-13 07:03:53.408462] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:4278190080 len:513 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.205 [2024-12-13 07:03:53.408483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.205 #104 NEW cov: 11901 ft: 15627 corp: 39/2305b lim: 100 exec/s: 52 rss: 69Mb L: 60/99 MS: 1 ChangeBinInt- 00:08:35.205 #104 DONE cov: 11901 ft: 15627 corp: 39/2305b lim: 100 exec/s: 52 rss: 69Mb 00:08:35.205 ###### Recommended dictionary. ###### 00:08:35.205 "\377\377\377\017" # Uses: 5 00:08:35.205 "\002\000\000\000" # Uses: 2 00:08:35.205 "\000\000\000\000\000\000\000\004" # Uses: 1 00:08:35.205 "\377\377}\376\\\016OU" # Uses: 0 00:08:35.205 "\377\377" # Uses: 0 00:08:35.205 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:35.205 ###### End of recommended dictionary. ###### 00:08:35.205 Done 104 runs in 2 second(s) 00:08:35.465 07:03:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:35.465 07:03:53 -- ../common.sh@72 -- # (( i++ )) 00:08:35.465 07:03:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.465 07:03:53 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:35.465 00:08:35.465 real 1m2.397s 00:08:35.465 user 1m38.893s 00:08:35.465 sys 0m6.958s 00:08:35.465 07:03:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:35.465 07:03:53 -- common/autotest_common.sh@10 -- # set +x 00:08:35.465 ************************************ 00:08:35.465 END TEST nvmf_fuzz 00:08:35.465 ************************************ 00:08:35.465 07:03:53 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:35.465 07:03:53 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:35.465 07:03:53 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:35.465 07:03:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:35.465 07:03:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:35.465 07:03:53 -- common/autotest_common.sh@10 -- # set +x 00:08:35.465 ************************************ 00:08:35.465 START TEST vfio_fuzz 00:08:35.465 ************************************ 00:08:35.465 07:03:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:35.465 * Looking for test storage... 
00:08:35.465 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.465 07:03:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:35.465 07:03:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:35.465 07:03:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:35.727 07:03:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:35.727 07:03:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:35.727 07:03:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:35.727 07:03:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:35.727 07:03:53 -- scripts/common.sh@335 -- # IFS=.-: 00:08:35.727 07:03:53 -- scripts/common.sh@335 -- # read -ra ver1 00:08:35.727 07:03:53 -- scripts/common.sh@336 -- # IFS=.-: 00:08:35.727 07:03:53 -- scripts/common.sh@336 -- # read -ra ver2 00:08:35.727 07:03:53 -- scripts/common.sh@337 -- # local 'op=<' 00:08:35.727 07:03:53 -- scripts/common.sh@339 -- # ver1_l=2 00:08:35.727 07:03:53 -- scripts/common.sh@340 -- # ver2_l=1 00:08:35.727 07:03:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:35.727 07:03:53 -- scripts/common.sh@343 -- # case "$op" in 00:08:35.727 07:03:53 -- scripts/common.sh@344 -- # : 1 00:08:35.727 07:03:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:35.727 07:03:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:35.727 07:03:53 -- scripts/common.sh@364 -- # decimal 1 00:08:35.727 07:03:53 -- scripts/common.sh@352 -- # local d=1 00:08:35.727 07:03:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:35.727 07:03:53 -- scripts/common.sh@354 -- # echo 1 00:08:35.727 07:03:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:35.727 07:03:53 -- scripts/common.sh@365 -- # decimal 2 00:08:35.727 07:03:53 -- scripts/common.sh@352 -- # local d=2 00:08:35.727 07:03:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:35.727 07:03:53 -- scripts/common.sh@354 -- # echo 2 00:08:35.727 07:03:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:35.727 07:03:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:35.727 07:03:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:35.727 07:03:53 -- scripts/common.sh@367 -- # return 0 00:08:35.727 07:03:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:35.727 07:03:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:35.727 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.727 --rc genhtml_branch_coverage=1 00:08:35.727 --rc genhtml_function_coverage=1 00:08:35.727 --rc genhtml_legend=1 00:08:35.727 --rc geninfo_all_blocks=1 00:08:35.727 --rc geninfo_unexecuted_blocks=1 00:08:35.727 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.727 ' 00:08:35.727 07:03:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:35.727 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.727 --rc genhtml_branch_coverage=1 00:08:35.727 --rc genhtml_function_coverage=1 00:08:35.727 --rc genhtml_legend=1 00:08:35.727 --rc geninfo_all_blocks=1 00:08:35.727 --rc geninfo_unexecuted_blocks=1 00:08:35.727 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.727 ' 00:08:35.727 07:03:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:35.727 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.727 --rc genhtml_branch_coverage=1 
00:08:35.727 --rc genhtml_function_coverage=1 00:08:35.727 --rc genhtml_legend=1 00:08:35.727 --rc geninfo_all_blocks=1 00:08:35.727 --rc geninfo_unexecuted_blocks=1 00:08:35.727 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.727 ' 00:08:35.727 07:03:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:35.727 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.727 --rc genhtml_branch_coverage=1 00:08:35.727 --rc genhtml_function_coverage=1 00:08:35.727 --rc genhtml_legend=1 00:08:35.727 --rc geninfo_all_blocks=1 00:08:35.727 --rc geninfo_unexecuted_blocks=1 00:08:35.727 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.727 ' 00:08:35.727 07:03:53 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:35.727 07:03:53 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:35.727 07:03:53 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:35.727 07:03:53 -- common/autotest_common.sh@34 -- # set -e 00:08:35.727 07:03:53 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:35.727 07:03:53 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:35.727 07:03:53 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:35.727 07:03:53 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:35.727 07:03:53 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:35.727 07:03:53 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:35.727 07:03:53 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:35.727 07:03:53 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:35.727 07:03:53 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:35.727 07:03:53 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:35.727 07:03:53 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:35.727 07:03:53 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:35.727 07:03:53 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:35.727 07:03:53 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:35.727 07:03:53 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:35.727 07:03:53 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:35.727 07:03:53 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:35.727 07:03:53 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:35.727 07:03:53 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:35.727 07:03:53 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:35.727 07:03:53 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:35.727 07:03:53 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:35.727 07:03:53 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:35.727 07:03:53 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:35.727 07:03:53 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:35.727 07:03:53 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:35.727 07:03:53 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:35.727 07:03:53 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:35.727 07:03:53 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:35.727 
07:03:53 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:35.727 07:03:53 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:35.727 07:03:53 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:35.727 07:03:53 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:35.727 07:03:53 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:35.727 07:03:53 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:35.727 07:03:53 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:35.727 07:03:53 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:35.727 07:03:53 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:35.727 07:03:53 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:35.727 07:03:53 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:35.727 07:03:53 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:35.727 07:03:53 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:35.727 07:03:53 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:35.727 07:03:53 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:35.727 07:03:53 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:35.727 07:03:53 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:35.727 07:03:53 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:35.727 07:03:53 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:35.727 07:03:53 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:35.727 07:03:53 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:35.727 07:03:53 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:35.728 07:03:53 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:35.728 07:03:53 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:35.728 07:03:53 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:35.728 07:03:53 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:35.728 07:03:53 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:35.728 07:03:53 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:35.728 07:03:53 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:35.728 07:03:53 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:35.728 07:03:53 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:35.728 07:03:53 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:35.728 07:03:53 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:35.728 07:03:53 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:35.728 07:03:53 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:35.728 07:03:53 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:35.728 07:03:53 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:35.728 07:03:53 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:35.728 07:03:53 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:35.728 07:03:53 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:35.728 07:03:53 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:35.728 07:03:53 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:35.728 07:03:53 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:35.728 07:03:53 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:35.728 07:03:53 -- 
common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:35.728 07:03:53 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:35.728 07:03:53 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:35.728 07:03:53 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:35.728 07:03:53 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:35.728 07:03:53 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:35.728 07:03:53 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:35.728 07:03:53 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:35.728 07:03:53 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:35.728 07:03:53 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:35.728 07:03:53 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:35.728 07:03:53 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:35.728 07:03:53 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:35.728 07:03:53 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:35.728 07:03:53 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:35.728 07:03:53 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:35.728 07:03:53 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:35.728 07:03:53 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:35.728 07:03:53 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:35.728 07:03:53 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:35.728 07:03:53 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:35.728 07:03:53 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:35.728 07:03:53 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:35.728 07:03:53 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:35.728 07:03:53 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:35.728 07:03:53 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:35.728 #define SPDK_CONFIG_H 00:08:35.728 #define SPDK_CONFIG_APPS 1 00:08:35.728 #define SPDK_CONFIG_ARCH native 00:08:35.728 #undef SPDK_CONFIG_ASAN 00:08:35.728 #undef SPDK_CONFIG_AVAHI 00:08:35.728 #undef SPDK_CONFIG_CET 00:08:35.728 #define SPDK_CONFIG_COVERAGE 1 00:08:35.728 #define SPDK_CONFIG_CROSS_PREFIX 00:08:35.728 #undef SPDK_CONFIG_CRYPTO 00:08:35.728 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:35.728 #undef SPDK_CONFIG_CUSTOMOCF 00:08:35.728 #undef SPDK_CONFIG_DAOS 00:08:35.728 #define SPDK_CONFIG_DAOS_DIR 00:08:35.728 #define SPDK_CONFIG_DEBUG 1 00:08:35.728 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:35.728 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:35.728 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:35.728 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:35.728 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:35.728 #define 
SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:35.728 #define SPDK_CONFIG_EXAMPLES 1 00:08:35.728 #undef SPDK_CONFIG_FC 00:08:35.728 #define SPDK_CONFIG_FC_PATH 00:08:35.728 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:35.728 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:35.728 #undef SPDK_CONFIG_FUSE 00:08:35.728 #define SPDK_CONFIG_FUZZER 1 00:08:35.728 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:35.728 #undef SPDK_CONFIG_GOLANG 00:08:35.728 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:35.728 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:35.728 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:35.728 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:35.728 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:35.728 #define SPDK_CONFIG_IDXD 1 00:08:35.728 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:35.728 #undef SPDK_CONFIG_IPSEC_MB 00:08:35.728 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:35.728 #define SPDK_CONFIG_ISAL 1 00:08:35.728 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:35.728 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:35.728 #define SPDK_CONFIG_LIBDIR 00:08:35.728 #undef SPDK_CONFIG_LTO 00:08:35.728 #define SPDK_CONFIG_MAX_LCORES 00:08:35.728 #define SPDK_CONFIG_NVME_CUSE 1 00:08:35.728 #undef SPDK_CONFIG_OCF 00:08:35.728 #define SPDK_CONFIG_OCF_PATH 00:08:35.728 #define SPDK_CONFIG_OPENSSL_PATH 00:08:35.728 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:35.728 #undef SPDK_CONFIG_PGO_USE 00:08:35.728 #define SPDK_CONFIG_PREFIX /usr/local 00:08:35.728 #undef SPDK_CONFIG_RAID5F 00:08:35.728 #undef SPDK_CONFIG_RBD 00:08:35.728 #define SPDK_CONFIG_RDMA 1 00:08:35.728 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:35.728 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:35.728 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:35.728 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:35.728 #undef SPDK_CONFIG_SHARED 00:08:35.728 #undef SPDK_CONFIG_SMA 00:08:35.728 #define SPDK_CONFIG_TESTS 1 00:08:35.728 #undef SPDK_CONFIG_TSAN 00:08:35.728 #define SPDK_CONFIG_UBLK 1 00:08:35.728 #define SPDK_CONFIG_UBSAN 1 00:08:35.728 #undef SPDK_CONFIG_UNIT_TESTS 00:08:35.728 #undef SPDK_CONFIG_URING 00:08:35.728 #define SPDK_CONFIG_URING_PATH 00:08:35.728 #undef SPDK_CONFIG_URING_ZNS 00:08:35.728 #undef SPDK_CONFIG_USDT 00:08:35.728 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:35.728 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:35.728 #define SPDK_CONFIG_VFIO_USER 1 00:08:35.728 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:35.728 #define SPDK_CONFIG_VHOST 1 00:08:35.728 #define SPDK_CONFIG_VIRTIO 1 00:08:35.728 #undef SPDK_CONFIG_VTUNE 00:08:35.728 #define SPDK_CONFIG_VTUNE_DIR 00:08:35.728 #define SPDK_CONFIG_WERROR 1 00:08:35.728 #define SPDK_CONFIG_WPDK_DIR 00:08:35.728 #undef SPDK_CONFIG_XNVME 00:08:35.728 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:35.728 07:03:53 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:35.728 07:03:53 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:35.728 07:03:53 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:35.728 07:03:53 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:35.728 07:03:53 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:35.728 07:03:53 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.728 07:03:53 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.728 07:03:53 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.728 07:03:53 -- paths/export.sh@5 -- # export PATH 00:08:35.728 07:03:53 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:35.728 07:03:53 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:35.728 07:03:53 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:35.728 07:03:53 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:35.728 07:03:53 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:35.728 07:03:53 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:35.728 07:03:53 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:35.728 07:03:53 -- pm/common@16 -- # TEST_TAG=N/A 00:08:35.728 07:03:53 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:35.728 07:03:53 -- common/autotest_common.sh@52 -- # : 1 00:08:35.728 07:03:53 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:35.728 07:03:53 -- common/autotest_common.sh@56 -- # : 0 00:08:35.728 07:03:53 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:35.728 07:03:53 -- common/autotest_common.sh@58 -- # : 0 00:08:35.728 07:03:53 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:35.728 07:03:53 -- common/autotest_common.sh@60 -- # : 1 00:08:35.728 07:03:53 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:08:35.729 07:03:53 -- common/autotest_common.sh@62 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:35.729 07:03:53 -- common/autotest_common.sh@64 -- # : 00:08:35.729 07:03:53 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:35.729 07:03:53 -- common/autotest_common.sh@66 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:35.729 07:03:53 -- common/autotest_common.sh@68 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:35.729 07:03:53 -- common/autotest_common.sh@70 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:35.729 07:03:53 -- common/autotest_common.sh@72 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:35.729 07:03:53 -- common/autotest_common.sh@74 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:35.729 07:03:53 -- common/autotest_common.sh@76 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:35.729 07:03:53 -- common/autotest_common.sh@78 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:35.729 07:03:53 -- common/autotest_common.sh@80 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:35.729 07:03:53 -- common/autotest_common.sh@82 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:35.729 07:03:53 -- common/autotest_common.sh@84 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:35.729 07:03:53 -- common/autotest_common.sh@86 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:35.729 07:03:53 -- common/autotest_common.sh@88 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:35.729 07:03:53 -- common/autotest_common.sh@90 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:35.729 07:03:53 -- common/autotest_common.sh@92 -- # : 1 00:08:35.729 07:03:53 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:35.729 07:03:53 -- common/autotest_common.sh@94 -- # : 1 00:08:35.729 07:03:53 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:35.729 07:03:53 -- common/autotest_common.sh@96 -- # : rdma 00:08:35.729 07:03:53 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:35.729 07:03:53 -- common/autotest_common.sh@98 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:35.729 07:03:53 -- common/autotest_common.sh@100 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:35.729 07:03:53 -- common/autotest_common.sh@102 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:35.729 07:03:53 -- common/autotest_common.sh@104 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:35.729 07:03:53 -- common/autotest_common.sh@106 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:35.729 07:03:53 -- common/autotest_common.sh@108 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:08:35.729 07:03:53 -- common/autotest_common.sh@110 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:35.729 07:03:53 -- common/autotest_common.sh@112 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:35.729 07:03:53 -- common/autotest_common.sh@114 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:35.729 07:03:53 -- common/autotest_common.sh@116 -- # : 1 00:08:35.729 07:03:53 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:35.729 07:03:53 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:35.729 07:03:53 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:35.729 07:03:53 -- common/autotest_common.sh@120 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:35.729 07:03:53 -- common/autotest_common.sh@122 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:35.729 07:03:53 -- common/autotest_common.sh@124 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:35.729 07:03:53 -- common/autotest_common.sh@126 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:35.729 07:03:53 -- common/autotest_common.sh@128 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:35.729 07:03:53 -- common/autotest_common.sh@130 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:35.729 07:03:53 -- common/autotest_common.sh@132 -- # : v22.11.4 00:08:35.729 07:03:53 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:35.729 07:03:53 -- common/autotest_common.sh@134 -- # : true 00:08:35.729 07:03:53 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:35.729 07:03:53 -- common/autotest_common.sh@136 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:35.729 07:03:53 -- common/autotest_common.sh@138 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:35.729 07:03:53 -- common/autotest_common.sh@140 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:35.729 07:03:53 -- common/autotest_common.sh@142 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:35.729 07:03:53 -- common/autotest_common.sh@144 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:35.729 07:03:53 -- common/autotest_common.sh@146 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:35.729 07:03:53 -- common/autotest_common.sh@148 -- # : 00:08:35.729 07:03:53 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:35.729 07:03:53 -- common/autotest_common.sh@150 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:35.729 07:03:53 -- common/autotest_common.sh@152 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:35.729 07:03:53 -- common/autotest_common.sh@154 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:35.729 07:03:53 -- 
common/autotest_common.sh@156 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:35.729 07:03:53 -- common/autotest_common.sh@158 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:35.729 07:03:53 -- common/autotest_common.sh@160 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:35.729 07:03:53 -- common/autotest_common.sh@163 -- # : 00:08:35.729 07:03:53 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:35.729 07:03:53 -- common/autotest_common.sh@165 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:35.729 07:03:53 -- common/autotest_common.sh@167 -- # : 0 00:08:35.729 07:03:53 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:35.729 07:03:53 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:35.729 07:03:53 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:35.729 07:03:53 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:35.729 07:03:53 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:35.729 07:03:53 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.729 07:03:53 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.729 07:03:53 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.729 07:03:53 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:35.729 07:03:53 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:35.729 07:03:53 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:35.729 07:03:53 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:35.729 07:03:53 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:35.729 07:03:53 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:35.729 07:03:53 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:35.729 07:03:53 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:35.729 07:03:53 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:35.730 07:03:53 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:35.730 07:03:53 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:35.730 07:03:53 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:35.730 07:03:53 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:35.730 07:03:53 -- common/autotest_common.sh@196 -- # cat 00:08:35.730 07:03:53 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:35.730 07:03:53 -- common/autotest_common.sh@224 -- # export 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:35.730 07:03:53 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:35.730 07:03:53 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:35.730 07:03:53 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:35.730 07:03:53 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:35.730 07:03:53 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:35.730 07:03:53 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:35.730 07:03:53 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:35.730 07:03:53 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:35.730 07:03:53 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:35.730 07:03:53 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:35.730 07:03:53 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:35.730 07:03:53 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:35.730 07:03:53 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:35.730 07:03:53 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:35.730 07:03:53 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:35.730 07:03:53 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:35.730 07:03:53 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:35.730 07:03:53 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:08:35.730 07:03:53 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:08:35.730 07:03:53 -- common/autotest_common.sh@249 -- # _LCOV= 00:08:35.730 07:03:53 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@250 -- # _LCOV=1 00:08:35.730 07:03:53 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:35.730 07:03:53 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:08:35.730 07:03:53 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:35.730 07:03:53 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:08:35.730 07:03:53 -- common/autotest_common.sh@259 -- # export valgrind= 00:08:35.730 07:03:53 -- common/autotest_common.sh@259 -- # valgrind= 00:08:35.730 07:03:53 -- common/autotest_common.sh@265 -- # uname -s 00:08:35.730 07:03:53 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:08:35.730 07:03:53 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:08:35.730 07:03:53 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:08:35.730 07:03:53 -- 
common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:08:35.730 07:03:53 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@275 -- # MAKE=make 00:08:35.730 07:03:53 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:08:35.730 07:03:53 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:08:35.730 07:03:53 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:08:35.730 07:03:53 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:35.730 07:03:53 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:08:35.730 07:03:53 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:08:35.730 07:03:53 -- common/autotest_common.sh@319 -- # [[ -z 500981 ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@319 -- # kill -0 500981 00:08:35.730 07:03:53 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:08:35.730 07:03:53 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:08:35.730 07:03:53 -- common/autotest_common.sh@332 -- # local mount target_dir 00:08:35.730 07:03:53 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:08:35.730 07:03:53 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:08:35.730 07:03:53 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:08:35.730 07:03:53 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:08:35.730 07:03:53 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.j3GAOP 00:08:35.730 07:03:53 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:35.730 07:03:53 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.j3GAOP/tests/vfio /tmp/spdk.j3GAOP 00:08:35.730 07:03:53 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:08:35.730 07:03:53 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.730 07:03:53 -- common/autotest_common.sh@328 -- # df -T 00:08:35.730 07:03:53 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:08:35.730 07:03:53 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:08:35.730 07:03:53 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # avails["$mount"]=785162240 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:08:35.730 07:03:53 -- common/autotest_common.sh@364 -- # uses["$mount"]=4499267584 00:08:35.730 07:03:53 -- 
common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # avails["$mount"]=53192916992 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:08:35.730 07:03:53 -- common/autotest_common.sh@364 -- # uses["$mount"]=8537690112 00:08:35.730 07:03:53 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864044032 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:08:35.730 07:03:53 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:08:35.730 07:03:53 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340121600 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:08:35.730 07:03:53 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:08:35.730 07:03:53 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864986112 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:08:35.730 07:03:53 -- common/autotest_common.sh@364 -- # uses["$mount"]=319488 00:08:35.730 07:03:53 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:08:35.730 07:03:53 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:08:35.730 07:03:53 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:08:35.730 07:03:53 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:35.730 07:03:53 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:08:35.730 * Looking for test storage... 
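The set_test_storage trace around this point populates parallel bash arrays (mounts, fss, sizes, avails, uses) from df output, one entry per mount point; the lines that follow then pick the mount backing the test directory and compare its available space against the 2 GiB request. A minimal sketch of the same pattern, assuming bash 4+ associative arrays and GNU df with -B1 for byte counts; paths and the request size are placeholders mirroring the trace:

    #!/usr/bin/env bash
    # Read every mount into parallel arrays keyed by mount point, as in the trace.
    declare -A fss sizes avails
    while read -r source fs size used avail _ mount; do
        fss["$mount"]=$fs          # filesystem type (overlay, tmpfs, ext2, ...)
        sizes["$mount"]=$size      # total bytes
        avails["$mount"]=$avail    # available bytes
    done < <(df -T -B1 | grep -v Filesystem)

    requested_size=$((2 * 1024 * 1024 * 1024))   # 2 GiB, as requested in the trace
    testdir=${1:-$PWD}                           # candidate test directory
    mount=$(df "$testdir" | awk '$1 !~ /Filesystem/{print $6}')
    target_space=${avails["$mount"]}
    if (( target_space >= requested_size )); then
        printf '* Found test storage at %s\n' "$testdir"
    else
        echo "insufficient space on $mount" >&2
    fi
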
00:08:35.730 07:03:53 -- common/autotest_common.sh@369 -- # local target_space new_size 00:08:35.730 07:03:53 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:08:35.730 07:03:53 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:35.730 07:03:53 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.730 07:03:53 -- common/autotest_common.sh@373 -- # mount=/ 00:08:35.730 07:03:53 -- common/autotest_common.sh@375 -- # target_space=53192916992 00:08:35.730 07:03:53 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:08:35.730 07:03:53 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:08:35.730 07:03:53 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:08:35.730 07:03:53 -- common/autotest_common.sh@382 -- # new_size=10752282624 00:08:35.730 07:03:53 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:35.730 07:03:53 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.730 07:03:53 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.730 07:03:53 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.730 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:35.730 07:03:53 -- common/autotest_common.sh@390 -- # return 0 00:08:35.730 07:03:53 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:08:35.730 07:03:53 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:08:35.730 07:03:53 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:35.731 07:03:53 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:35.731 07:03:53 -- common/autotest_common.sh@1682 -- # true 00:08:35.731 07:03:53 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:08:35.731 07:03:53 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:35.731 07:03:53 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:35.731 07:03:53 -- common/autotest_common.sh@27 -- # exec 00:08:35.731 07:03:53 -- common/autotest_common.sh@29 -- # exec 00:08:35.731 07:03:53 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:35.731 07:03:53 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:35.731 07:03:53 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:35.731 07:03:53 -- common/autotest_common.sh@18 -- # set -x 00:08:35.731 07:03:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:35.731 07:03:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:35.731 07:03:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:35.990 07:03:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:35.990 07:03:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:35.990 07:03:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:35.990 07:03:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:35.990 07:03:53 -- scripts/common.sh@335 -- # IFS=.-: 00:08:35.990 07:03:53 -- scripts/common.sh@335 -- # read -ra ver1 00:08:35.990 07:03:53 -- scripts/common.sh@336 -- # IFS=.-: 00:08:35.990 07:03:53 -- scripts/common.sh@336 -- # read -ra ver2 00:08:35.990 07:03:53 -- scripts/common.sh@337 -- # local 'op=<' 00:08:35.990 07:03:53 -- scripts/common.sh@339 -- # ver1_l=2 00:08:35.990 07:03:53 -- scripts/common.sh@340 -- # ver2_l=1 00:08:35.990 07:03:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:35.990 07:03:53 -- scripts/common.sh@343 -- # case "$op" in 00:08:35.990 07:03:53 -- scripts/common.sh@344 -- # : 1 00:08:35.990 07:03:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:35.990 07:03:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:35.990 07:03:53 -- scripts/common.sh@364 -- # decimal 1 00:08:35.990 07:03:53 -- scripts/common.sh@352 -- # local d=1 00:08:35.990 07:03:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:35.991 07:03:54 -- scripts/common.sh@354 -- # echo 1 00:08:35.991 07:03:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:35.991 07:03:54 -- scripts/common.sh@365 -- # decimal 2 00:08:35.991 07:03:54 -- scripts/common.sh@352 -- # local d=2 00:08:35.991 07:03:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:35.991 07:03:54 -- scripts/common.sh@354 -- # echo 2 00:08:35.991 07:03:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:35.991 07:03:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:35.991 07:03:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:35.991 07:03:54 -- scripts/common.sh@367 -- # return 0 00:08:35.991 07:03:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:35.991 07:03:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:35.991 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.991 --rc genhtml_branch_coverage=1 00:08:35.991 --rc genhtml_function_coverage=1 00:08:35.991 --rc genhtml_legend=1 00:08:35.991 --rc geninfo_all_blocks=1 00:08:35.991 --rc geninfo_unexecuted_blocks=1 00:08:35.991 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.991 ' 00:08:35.991 07:03:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:35.991 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.991 --rc genhtml_branch_coverage=1 00:08:35.991 --rc genhtml_function_coverage=1 00:08:35.991 --rc genhtml_legend=1 00:08:35.991 --rc geninfo_all_blocks=1 00:08:35.991 --rc geninfo_unexecuted_blocks=1 00:08:35.991 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.991 ' 00:08:35.991 07:03:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:35.991 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:08:35.991 --rc genhtml_branch_coverage=1 00:08:35.991 --rc genhtml_function_coverage=1 00:08:35.991 --rc genhtml_legend=1 00:08:35.991 --rc geninfo_all_blocks=1 00:08:35.991 --rc geninfo_unexecuted_blocks=1 00:08:35.991 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.991 ' 00:08:35.991 07:03:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:35.991 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.991 --rc genhtml_branch_coverage=1 00:08:35.991 --rc genhtml_function_coverage=1 00:08:35.991 --rc genhtml_legend=1 00:08:35.991 --rc geninfo_all_blocks=1 00:08:35.991 --rc geninfo_unexecuted_blocks=1 00:08:35.991 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.991 ' 00:08:35.991 07:03:54 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:35.991 07:03:54 -- ../common.sh@8 -- # pids=() 00:08:35.991 07:03:54 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:35.991 07:03:54 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:35.991 07:03:54 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:35.991 07:03:54 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:35.991 07:03:54 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:35.991 07:03:54 -- vfio/run.sh@65 -- # mem_size=0 00:08:35.991 07:03:54 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:35.991 07:03:54 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:35.991 07:03:54 -- ../common.sh@69 -- # local fuzz_num=7 00:08:35.991 07:03:54 -- ../common.sh@70 -- # local time=1 00:08:35.991 07:03:54 -- ../common.sh@72 -- # (( i = 0 )) 00:08:35.991 07:03:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.991 07:03:54 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:35.991 07:03:54 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:35.991 07:03:54 -- vfio/run.sh@23 -- # local timen=1 00:08:35.991 07:03:54 -- vfio/run.sh@24 -- # local core=0x1 00:08:35.991 07:03:54 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:35.991 07:03:54 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:35.991 07:03:54 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:35.991 07:03:54 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:35.991 07:03:54 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:35.991 07:03:54 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:35.991 07:03:54 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:35.991 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:35.991 07:03:54 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 
-Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:35.991 [2024-12-13 07:03:54.065058] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:35.991 [2024-12-13 07:03:54.065132] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid501034 ] 00:08:35.991 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.991 [2024-12-13 07:03:54.135721] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.991 [2024-12-13 07:03:54.172091] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:35.991 [2024-12-13 07:03:54.172245] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.250 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.250 INFO: Seed: 2885171513 00:08:36.250 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:36.250 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:36.250 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:36.250 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.250 #2 INITED exec/s: 0 rss: 60Mb 00:08:36.250 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:36.250 This may also happen if the target rejected all inputs we tried so far 00:08:36.768 NEW_FUNC[1/631]: 0x450dd8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:36.768 NEW_FUNC[2/631]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:36.768 #7 NEW cov: 10761 ft: 10706 corp: 2/7b lim: 60 exec/s: 0 rss: 66Mb L: 6/6 MS: 5 CrossOver-ChangeBit-ChangeByte-CopyPart-CMP- DE: "\026\000\000\000"- 00:08:37.027 #13 NEW cov: 10775 ft: 14662 corp: 3/37b lim: 60 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:08:37.027 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:37.027 #14 NEW cov: 10792 ft: 15435 corp: 4/43b lim: 60 exec/s: 0 rss: 68Mb L: 6/30 MS: 1 ShuffleBytes- 00:08:37.286 #20 NEW cov: 10792 ft: 16732 corp: 5/49b lim: 60 exec/s: 20 rss: 68Mb L: 6/30 MS: 1 ChangeByte- 00:08:37.545 #21 NEW cov: 10792 ft: 17134 corp: 6/79b lim: 60 exec/s: 21 rss: 68Mb L: 30/30 MS: 1 ChangeByte- 00:08:37.545 #22 NEW cov: 10792 ft: 17398 corp: 7/85b lim: 60 exec/s: 22 rss: 68Mb L: 6/30 MS: 1 CMP- DE: "\003\000"- 00:08:37.803 #24 NEW cov: 10792 ft: 17614 corp: 8/92b lim: 60 exec/s: 24 rss: 68Mb L: 7/30 MS: 2 PersAutoDict-CrossOver- DE: "\003\000"- 00:08:38.063 #25 NEW cov: 10792 ft: 17674 corp: 9/99b lim: 60 exec/s: 25 rss: 68Mb L: 7/30 MS: 1 InsertByte- 00:08:38.063 #26 NEW cov: 10799 ft: 17859 corp: 10/105b lim: 60 exec/s: 26 rss: 68Mb L: 6/30 MS: 1 ChangeByte- 00:08:38.323 #27 NEW cov: 10799 ft: 18026 corp: 11/146b lim: 60 exec/s: 13 rss: 68Mb L: 41/41 MS: 1 InsertRepeatedBytes- 00:08:38.323 #27 DONE cov: 10799 ft: 18026 corp: 11/146b lim: 60 exec/s: 13 rss: 68Mb 00:08:38.323 ###### Recommended dictionary. ###### 00:08:38.323 "\026\000\000\000" # Uses: 0 00:08:38.323 "\003\000" # Uses: 1 00:08:38.323 ###### End of recommended dictionary. 
###### 00:08:38.323 Done 27 runs in 2 second(s) 00:08:38.583 07:03:56 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:38.583 07:03:56 -- ../common.sh@72 -- # (( i++ )) 00:08:38.583 07:03:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:38.583 07:03:56 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:38.583 07:03:56 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:38.583 07:03:56 -- vfio/run.sh@23 -- # local timen=1 00:08:38.583 07:03:56 -- vfio/run.sh@24 -- # local core=0x1 00:08:38.583 07:03:56 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.583 07:03:56 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:38.583 07:03:56 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:38.583 07:03:56 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:38.583 07:03:56 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:38.583 07:03:56 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.583 07:03:56 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:38.583 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:38.583 07:03:56 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:38.583 [2024-12-13 07:03:56.740577] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:38.583 [2024-12-13 07:03:56.740667] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid501588 ] 00:08:38.583 EAL: No free 2048 kB hugepages reported on node 1 00:08:38.583 [2024-12-13 07:03:56.811559] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.842 [2024-12-13 07:03:56.848470] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:38.842 [2024-12-13 07:03:56.848624] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.842 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.842 INFO: Seed: 1277171349 00:08:38.842 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:38.842 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:38.842 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.842 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.842 #2 INITED exec/s: 0 rss: 60Mb 00:08:38.842 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
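Both fuzzer instances so far were launched by the same start_llvm_fuzz recipe traced above: create per-instance directories under /tmp/vfio-user-N, rewrite the shared fuzz_vfio_json.conf with sed so it points at that instance's own vfio-user domains, run llvm_vfio_fuzz against them for the configured time, then remove the directory. A condensed sketch of that recipe; SPDK_ROOT and OUTPUT are stand-ins for the workspace paths shown in the trace:

    #!/usr/bin/env bash
    # Per-instance launch, condensed from the vfio/run.sh trace (sketch only).
    i=$1                                     # fuzzer_type: 0..6 in this job
    dir=/tmp/vfio-user-$i
    corpus=$SPDK_ROOT/../corpus/llvm_vfio_$i
    mkdir -p "$dir/domain/1" "$dir/domain/2" "$corpus"

    # Point the template config at this instance's own vfio-user domains.
    sed -e "s%/tmp/vfio-user/domain/1%$dir/domain/1%" \
        -e "s%/tmp/vfio-user/domain/2%$dir/domain/2%" \
        "$SPDK_ROOT/test/fuzz/llvm/vfio/fuzz_vfio_json.conf" > "$dir/fuzz_vfio_json.conf"

    "$SPDK_ROOT/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz" \
        -m 0x1 -s 0 -P "$OUTPUT/llvm/" \
        -F "$dir/domain/1" -c "$dir/fuzz_vfio_json.conf" \
        -t 1 -D "$corpus" -Y "$dir/domain/2" \
        -r "$dir/spdk$i.sock" -Z "$i"

    rm -rf "$dir"                            # cleanup, as traced after each run
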
00:08:38.842 This may also happen if the target rejected all inputs we tried so far 00:08:39.101 [2024-12-13 07:03:57.135266] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.101 [2024-12-13 07:03:57.135297] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.101 [2024-12-13 07:03:57.135316] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.360 NEW_FUNC[1/638]: 0x451378 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:39.360 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:39.360 #4 NEW cov: 10775 ft: 10610 corp: 2/23b lim: 40 exec/s: 0 rss: 66Mb L: 22/22 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:39.360 [2024-12-13 07:03:57.589506] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.360 [2024-12-13 07:03:57.589538] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.360 [2024-12-13 07:03:57.589556] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.619 #5 NEW cov: 10789 ft: 14059 corp: 3/45b lim: 40 exec/s: 0 rss: 67Mb L: 22/22 MS: 1 ChangeBit- 00:08:39.619 [2024-12-13 07:03:57.781749] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.619 [2024-12-13 07:03:57.781772] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.619 [2024-12-13 07:03:57.781794] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.878 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:39.878 #6 NEW cov: 10809 ft: 14982 corp: 4/68b lim: 40 exec/s: 0 rss: 68Mb L: 23/23 MS: 1 InsertByte- 00:08:39.878 [2024-12-13 07:03:57.973380] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.878 [2024-12-13 07:03:57.973402] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.878 [2024-12-13 07:03:57.973419] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.878 #7 NEW cov: 10809 ft: 15891 corp: 5/97b lim: 40 exec/s: 7 rss: 68Mb L: 29/29 MS: 1 CrossOver- 00:08:40.138 [2024-12-13 07:03:58.165472] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.138 [2024-12-13 07:03:58.165494] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.138 [2024-12-13 07:03:58.165511] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.138 #8 NEW cov: 10809 ft: 16431 corp: 6/130b lim: 40 exec/s: 8 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:40.138 [2024-12-13 07:03:58.356582] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.138 [2024-12-13 07:03:58.356603] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.138 [2024-12-13 07:03:58.356620] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.397 #13 NEW cov: 10809 ft: 17500 corp: 7/147b lim: 40 exec/s: 13 rss: 68Mb L: 17/33 MS: 5 ChangeByte-ChangeByte-CrossOver-ChangeByte-InsertRepeatedBytes- 
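The numbered records in these runs follow libFuzzer's standard status format: input counter (#N), event (NEW, REDUCE, pulse, INITED, DONE), edge coverage (cov), features (ft), corpus size (corp), input size limit (lim), execution rate (exec/s) and resident memory (rss). A quick sketch for pulling the coverage trajectory out of a saved console log, assuming the usual one-record-per-line layout; the build.log filename is a placeholder:

    # Print input number, edge coverage and corpus size per libFuzzer event.
    awk '/ (NEW|REDUCE|pulse|INITED|DONE) cov:/ {
        ev = cov = corp = "?"
        for (i = 1; i <= NF; i++) {
            if ($i ~ /^#[0-9]+$/) ev = $i
            if ($i == "cov:")  cov  = $(i + 1)
            if ($i == "corp:") corp = $(i + 1)
        }
        print ev, "cov=" cov, "corp=" corp
    }' build.log
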
00:08:40.397 [2024-12-13 07:03:58.558644] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.397 [2024-12-13 07:03:58.558668] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.397 [2024-12-13 07:03:58.558686] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.656 #14 NEW cov: 10809 ft: 17537 corp: 8/164b lim: 40 exec/s: 14 rss: 68Mb L: 17/33 MS: 1 CrossOver- 00:08:40.656 [2024-12-13 07:03:58.750737] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.656 [2024-12-13 07:03:58.750758] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.656 [2024-12-13 07:03:58.750775] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.656 #15 NEW cov: 10816 ft: 17777 corp: 9/193b lim: 40 exec/s: 15 rss: 68Mb L: 29/33 MS: 1 ShuffleBytes- 00:08:40.916 [2024-12-13 07:03:58.943610] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.916 [2024-12-13 07:03:58.943631] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.916 [2024-12-13 07:03:58.943648] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.916 #16 pulse cov: 10816 ft: 17881 corp: 9/193b lim: 40 exec/s: 8 rss: 68Mb 00:08:40.916 #16 NEW cov: 10816 ft: 17881 corp: 10/210b lim: 40 exec/s: 8 rss: 68Mb L: 17/33 MS: 1 CrossOver- 00:08:40.916 #16 DONE cov: 10816 ft: 17881 corp: 10/210b lim: 40 exec/s: 8 rss: 68Mb 00:08:40.916 Done 16 runs in 2 second(s) 00:08:41.175 07:03:59 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:41.175 07:03:59 -- ../common.sh@72 -- # (( i++ )) 00:08:41.175 07:03:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.175 07:03:59 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:41.175 07:03:59 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:41.175 07:03:59 -- vfio/run.sh@23 -- # local timen=1 00:08:41.175 07:03:59 -- vfio/run.sh@24 -- # local core=0x1 00:08:41.175 07:03:59 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.175 07:03:59 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:41.175 07:03:59 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:41.175 07:03:59 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:41.175 07:03:59 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:41.175 07:03:59 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.175 07:03:59 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:41.175 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:41.175 07:03:59 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:41.175 [2024-12-13 
07:03:59.360024] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:41.175 [2024-12-13 07:03:59.360114] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid502050 ] 00:08:41.175 EAL: No free 2048 kB hugepages reported on node 1 00:08:41.434 [2024-12-13 07:03:59.432844] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.434 [2024-12-13 07:03:59.469374] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:41.434 [2024-12-13 07:03:59.469526] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.434 INFO: Running with entropic power schedule (0xFF, 100). 00:08:41.434 INFO: Seed: 3891174167 00:08:41.694 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:41.694 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:41.694 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.694 INFO: A corpus is not provided, starting from an empty corpus 00:08:41.694 #2 INITED exec/s: 0 rss: 60Mb 00:08:41.694 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:41.694 This may also happen if the target rejected all inputs we tried so far 00:08:41.694 [2024-12-13 07:03:59.779391] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:41.953 NEW_FUNC[1/636]: 0x451d68 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:41.953 NEW_FUNC[2/636]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:41.953 #12 NEW cov: 10759 ft: 10246 corp: 2/52b lim: 80 exec/s: 0 rss: 65Mb L: 51/51 MS: 5 InsertByte-ChangeBinInt-ChangeByte-CrossOver-InsertRepeatedBytes- 00:08:42.212 [2024-12-13 07:04:00.257243] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.212 #14 NEW cov: 10776 ft: 13275 corp: 3/122b lim: 80 exec/s: 0 rss: 67Mb L: 70/70 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:42.212 [2024-12-13 07:04:00.444107] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.471 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:42.471 #15 NEW cov: 10793 ft: 14459 corp: 4/192b lim: 80 exec/s: 0 rss: 68Mb L: 70/70 MS: 1 ChangeASCIIInt- 00:08:42.471 [2024-12-13 07:04:00.629091] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.731 #16 NEW cov: 10793 ft: 15068 corp: 5/243b lim: 80 exec/s: 16 rss: 68Mb L: 51/70 MS: 1 CopyPart- 00:08:42.731 [2024-12-13 07:04:00.811192] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.731 #30 NEW cov: 10793 ft: 15648 corp: 6/276b lim: 80 exec/s: 30 rss: 68Mb L: 33/70 MS: 4 InsertRepeatedBytes-EraseBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:42.989 [2024-12-13 07:04:00.989109] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.989 #31 NEW cov: 10793 ft: 16287 corp: 7/301b lim: 80 exec/s: 31 rss: 68Mb L: 25/70 MS: 1 CrossOver- 00:08:42.989 
[2024-12-13 07:04:01.176762] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.248 #32 NEW cov: 10793 ft: 16454 corp: 8/352b lim: 80 exec/s: 32 rss: 68Mb L: 51/70 MS: 1 ChangeBit- 00:08:43.248 [2024-12-13 07:04:01.358078] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.248 #33 NEW cov: 10793 ft: 16475 corp: 9/403b lim: 80 exec/s: 33 rss: 68Mb L: 51/70 MS: 1 ChangeBinInt- 00:08:43.507 [2024-12-13 07:04:01.540951] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.507 #34 NEW cov: 10800 ft: 16589 corp: 10/430b lim: 80 exec/s: 34 rss: 68Mb L: 27/70 MS: 1 CrossOver- 00:08:43.507 [2024-12-13 07:04:01.721676] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.765 #35 NEW cov: 10800 ft: 16640 corp: 11/505b lim: 80 exec/s: 17 rss: 68Mb L: 75/75 MS: 1 CrossOver- 00:08:43.765 #35 DONE cov: 10800 ft: 16640 corp: 11/505b lim: 80 exec/s: 17 rss: 68Mb 00:08:43.766 Done 35 runs in 2 second(s) 00:08:44.025 07:04:02 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:44.025 07:04:02 -- ../common.sh@72 -- # (( i++ )) 00:08:44.025 07:04:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.025 07:04:02 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:44.025 07:04:02 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:44.025 07:04:02 -- vfio/run.sh@23 -- # local timen=1 00:08:44.025 07:04:02 -- vfio/run.sh@24 -- # local core=0x1 00:08:44.025 07:04:02 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:44.025 07:04:02 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:44.025 07:04:02 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:44.025 07:04:02 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:44.025 07:04:02 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:44.025 07:04:02 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:44.025 07:04:02 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:44.025 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:44.025 07:04:02 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:44.025 [2024-12-13 07:04:02.123138] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:44.025 [2024-12-13 07:04:02.123237] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid502443 ] 00:08:44.025 EAL: No free 2048 kB hugepages reported on node 1 00:08:44.025 [2024-12-13 07:04:02.195591] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.025 [2024-12-13 07:04:02.231723] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:44.025 [2024-12-13 07:04:02.231861] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:44.284 INFO: Running with entropic power schedule (0xFF, 100). 00:08:44.284 INFO: Seed: 2369231963 00:08:44.284 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:44.284 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:44.284 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:44.284 INFO: A corpus is not provided, starting from an empty corpus 00:08:44.284 #2 INITED exec/s: 0 rss: 59Mb 00:08:44.284 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:44.284 This may also happen if the target rejected all inputs we tried so far 00:08:44.543 [2024-12-13 07:04:02.526225] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:08:44.543 [2024-12-13 07:04:02.526258] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:44.543 [2024-12-13 07:04:02.526273] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.543 [2024-12-13 07:04:02.526290] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:44.802 NEW_FUNC[1/637]: 0x452458 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:44.802 NEW_FUNC[2/637]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:44.802 #8 NEW cov: 10776 ft: 10765 corp: 2/78b lim: 320 exec/s: 0 rss: 65Mb L: 77/77 MS: 1 InsertRepeatedBytes- 00:08:44.803 [2024-12-13 07:04:02.965053] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:44.803 [2024-12-13 07:04:02.965086] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:44.803 [2024-12-13 07:04:02.965097] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.803 [2024-12-13 07:04:02.965129] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.062 NEW_FUNC[1/1]: 0x16f31d8 in nvme_transport_qpair_submit_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_transport.c:592 00:08:45.062 #9 NEW cov: 10791 ft: 13449 corp: 3/156b lim: 320 exec/s: 0 rss: 66Mb L: 78/78 MS: 1 CrossOver- 00:08:45.062 [2024-12-13 07:04:03.133206] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 
prot=0x3: Invalid argument 00:08:45.062 [2024-12-13 07:04:03.133229] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:45.062 [2024-12-13 07:04:03.133239] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.062 [2024-12-13 07:04:03.133255] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.062 #10 NEW cov: 10791 ft: 13681 corp: 4/234b lim: 320 exec/s: 0 rss: 67Mb L: 78/78 MS: 1 ShuffleBytes- 00:08:45.062 [2024-12-13 07:04:03.300175] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:45.062 [2024-12-13 07:04:03.300203] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:45.062 [2024-12-13 07:04:03.300214] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.062 [2024-12-13 07:04:03.300231] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.321 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:45.321 #16 NEW cov: 10808 ft: 13937 corp: 5/312b lim: 320 exec/s: 0 rss: 67Mb L: 78/78 MS: 1 ChangeBit- 00:08:45.321 [2024-12-13 07:04:03.466112] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:45.321 [2024-12-13 07:04:03.466135] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:45.321 [2024-12-13 07:04:03.466145] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.321 [2024-12-13 07:04:03.466176] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.580 #17 NEW cov: 10808 ft: 15467 corp: 6/389b lim: 320 exec/s: 17 rss: 67Mb L: 77/78 MS: 1 ChangeBinInt- 00:08:45.580 [2024-12-13 07:04:03.634037] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:45.580 [2024-12-13 07:04:03.634060] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:45.580 [2024-12-13 07:04:03.634074] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.580 [2024-12-13 07:04:03.634107] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.580 #18 NEW cov: 10808 ft: 15885 corp: 7/467b lim: 320 exec/s: 18 rss: 67Mb L: 78/78 MS: 1 ChangeBinInt- 00:08:45.580 [2024-12-13 07:04:03.803121] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:45.580 [2024-12-13 07:04:03.803145] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:45.581 [2024-12-13 07:04:03.803155] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.581 [2024-12-13 07:04:03.803171] vfio_user.c: 144:vfio_user_read: 
*ERROR*: Command 2 return failure 00:08:45.840 #19 NEW cov: 10811 ft: 16523 corp: 8/544b lim: 320 exec/s: 19 rss: 67Mb L: 77/78 MS: 1 CopyPart- 00:08:45.840 [2024-12-13 07:04:03.972038] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:45.840 [2024-12-13 07:04:03.972062] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:45.840 [2024-12-13 07:04:03.972072] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.840 [2024-12-13 07:04:03.972089] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.840 #25 NEW cov: 10811 ft: 16635 corp: 9/621b lim: 320 exec/s: 25 rss: 67Mb L: 77/78 MS: 1 ShuffleBytes- 00:08:46.099 [2024-12-13 07:04:04.141588] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:46.099 [2024-12-13 07:04:04.141611] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:46.099 [2024-12-13 07:04:04.141622] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:46.099 [2024-12-13 07:04:04.141639] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:46.099 #26 NEW cov: 10818 ft: 16796 corp: 10/699b lim: 320 exec/s: 26 rss: 67Mb L: 78/78 MS: 1 ChangeBinInt- 00:08:46.099 [2024-12-13 07:04:04.310637] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:46.099 [2024-12-13 07:04:04.310660] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:46.099 [2024-12-13 07:04:04.310670] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:46.099 [2024-12-13 07:04:04.310686] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:46.357 #27 NEW cov: 10818 ft: 16856 corp: 11/776b lim: 320 exec/s: 27 rss: 67Mb L: 77/78 MS: 1 ChangeBit- 00:08:46.358 [2024-12-13 07:04:04.480039] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:46.358 [2024-12-13 07:04:04.480064] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:46.358 [2024-12-13 07:04:04.480074] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:46.358 [2024-12-13 07:04:04.480107] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:46.358 #29 NEW cov: 10818 ft: 17203 corp: 12/855b lim: 320 exec/s: 14 rss: 67Mb L: 79/79 MS: 2 ShuffleBytes-CrossOver- 00:08:46.358 #29 DONE cov: 10818 ft: 17203 corp: 12/855b lim: 320 exec/s: 14 rss: 67Mb 00:08:46.358 Done 29 runs in 2 second(s) 00:08:46.616 07:04:04 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:46.616 07:04:04 -- ../common.sh@72 -- # (( i++ )) 00:08:46.616 07:04:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:46.616 07:04:04 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:46.616 
07:04:04 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:46.616 07:04:04 -- vfio/run.sh@23 -- # local timen=1 00:08:46.616 07:04:04 -- vfio/run.sh@24 -- # local core=0x1 00:08:46.616 07:04:04 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:46.616 07:04:04 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:46.616 07:04:04 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:46.616 07:04:04 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:46.616 07:04:04 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:46.616 07:04:04 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:46.616 07:04:04 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:46.616 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:46.875 07:04:04 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:46.875 [2024-12-13 07:04:04.885136] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:46.875 [2024-12-13 07:04:04.885200] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid502969 ] 00:08:46.875 EAL: No free 2048 kB hugepages reported on node 1 00:08:46.875 [2024-12-13 07:04:04.954865] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.875 [2024-12-13 07:04:04.991295] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:46.875 [2024-12-13 07:04:04.991434] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.133 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.133 INFO: Seed: 822256754 00:08:47.133 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:47.133 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:47.133 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:47.133 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.133 #2 INITED exec/s: 0 rss: 60Mb 00:08:47.133 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:47.133 This may also happen if the target rejected all inputs we tried so far 00:08:47.651 NEW_FUNC[1/632]: 0x452cd8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:47.651 NEW_FUNC[2/632]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:47.651 #8 NEW cov: 10754 ft: 10619 corp: 2/51b lim: 320 exec/s: 0 rss: 65Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:47.651 #9 NEW cov: 10768 ft: 13700 corp: 3/102b lim: 320 exec/s: 0 rss: 67Mb L: 51/51 MS: 1 CrossOver- 00:08:47.910 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:47.910 #10 NEW cov: 10785 ft: 14840 corp: 4/152b lim: 320 exec/s: 0 rss: 68Mb L: 50/51 MS: 1 CrossOver- 00:08:48.169 #11 NEW cov: 10785 ft: 15810 corp: 5/202b lim: 320 exec/s: 11 rss: 68Mb L: 50/51 MS: 1 ChangeBit- 00:08:48.169 #12 NEW cov: 10785 ft: 16206 corp: 6/252b lim: 320 exec/s: 12 rss: 68Mb L: 50/51 MS: 1 CopyPart- 00:08:48.428 #13 NEW cov: 10785 ft: 16484 corp: 7/303b lim: 320 exec/s: 13 rss: 68Mb L: 51/51 MS: 1 ChangeBit- 00:08:48.687 #19 NEW cov: 10785 ft: 16813 corp: 8/456b lim: 320 exec/s: 19 rss: 68Mb L: 153/153 MS: 1 InsertRepeatedBytes- 00:08:48.946 #20 NEW cov: 10785 ft: 17085 corp: 9/633b lim: 320 exec/s: 20 rss: 68Mb L: 177/177 MS: 1 InsertRepeatedBytes- 00:08:48.946 #21 NEW cov: 10792 ft: 17418 corp: 10/683b lim: 320 exec/s: 21 rss: 68Mb L: 50/177 MS: 1 CrossOver- 00:08:49.205 #22 NEW cov: 10792 ft: 17471 corp: 11/836b lim: 320 exec/s: 11 rss: 68Mb L: 153/177 MS: 1 ChangeBit- 00:08:49.205 #22 DONE cov: 10792 ft: 17471 corp: 11/836b lim: 320 exec/s: 11 rss: 68Mb 00:08:49.205 Done 22 runs in 2 second(s) 00:08:49.465 07:04:07 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:49.465 07:04:07 -- ../common.sh@72 -- # (( i++ )) 00:08:49.465 07:04:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.465 07:04:07 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:49.465 07:04:07 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:49.465 07:04:07 -- vfio/run.sh@23 -- # local timen=1 00:08:49.465 07:04:07 -- vfio/run.sh@24 -- # local core=0x1 00:08:49.465 07:04:07 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.465 07:04:07 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:49.465 07:04:07 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:49.465 07:04:07 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:49.465 07:04:07 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:49.465 07:04:07 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.465 07:04:07 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:49.465 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:49.465 07:04:07 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:49.465 [2024-12-13 07:04:07.596449] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:49.465 [2024-12-13 07:04:07.596514] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid503520 ] 00:08:49.465 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.465 [2024-12-13 07:04:07.666167] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.465 [2024-12-13 07:04:07.702624] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:49.465 [2024-12-13 07:04:07.702763] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.724 INFO: Running with entropic power schedule (0xFF, 100). 00:08:49.724 INFO: Seed: 3534260354 00:08:49.724 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:49.724 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:49.724 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.724 INFO: A corpus is not provided, starting from an empty corpus 00:08:49.724 #2 INITED exec/s: 0 rss: 60Mb 00:08:49.724 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:49.724 This may also happen if the target rejected all inputs we tried so far 00:08:49.983 [2024-12-13 07:04:08.012821] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:49.983 [2024-12-13 07:04:08.012866] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.242 NEW_FUNC[1/638]: 0x4536d8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:50.242 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:50.242 #26 NEW cov: 10781 ft: 10661 corp: 2/72b lim: 120 exec/s: 0 rss: 65Mb L: 71/71 MS: 4 ChangeBinInt-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:50.501 [2024-12-13 07:04:08.492238] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.502 [2024-12-13 07:04:08.492280] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.502 #31 NEW cov: 10795 ft: 14062 corp: 3/186b lim: 120 exec/s: 0 rss: 66Mb L: 114/114 MS: 5 ShuffleBytes-ChangeByte-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:50.502 [2024-12-13 07:04:08.688217] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.502 [2024-12-13 07:04:08.688248] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.761 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:50.761 #33 NEW cov: 10815 ft: 15243 corp: 4/230b lim: 120 exec/s: 0 rss: 67Mb L: 44/114 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:50.761 [2024-12-13 07:04:08.885283] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.761 [2024-12-13 07:04:08.885314] vfio_user.c: 144:vfio_user_read: 
*ERROR*: Command 8 return failure 00:08:50.761 #39 NEW cov: 10815 ft: 16582 corp: 5/268b lim: 120 exec/s: 39 rss: 67Mb L: 38/114 MS: 1 EraseBytes- 00:08:51.019 [2024-12-13 07:04:09.078230] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.020 [2024-12-13 07:04:09.078260] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.020 #45 NEW cov: 10815 ft: 16708 corp: 6/376b lim: 120 exec/s: 45 rss: 67Mb L: 108/114 MS: 1 CopyPart- 00:08:51.278 [2024-12-13 07:04:09.262005] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.278 [2024-12-13 07:04:09.262036] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.278 #46 NEW cov: 10815 ft: 17162 corp: 7/447b lim: 120 exec/s: 46 rss: 69Mb L: 71/114 MS: 1 ChangeBinInt- 00:08:51.278 [2024-12-13 07:04:09.446651] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.278 [2024-12-13 07:04:09.446681] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.537 #47 NEW cov: 10815 ft: 17329 corp: 8/526b lim: 120 exec/s: 47 rss: 69Mb L: 79/114 MS: 1 CMP- DE: "\377\001\347\304.\234\260x"- 00:08:51.537 [2024-12-13 07:04:09.629955] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.537 [2024-12-13 07:04:09.629985] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.537 #48 NEW cov: 10822 ft: 17350 corp: 9/599b lim: 120 exec/s: 48 rss: 69Mb L: 73/114 MS: 1 CopyPart- 00:08:51.796 [2024-12-13 07:04:09.812828] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.796 [2024-12-13 07:04:09.812858] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.796 #54 NEW cov: 10822 ft: 17668 corp: 10/679b lim: 120 exec/s: 27 rss: 69Mb L: 80/114 MS: 1 InsertByte- 00:08:51.796 #54 DONE cov: 10822 ft: 17668 corp: 10/679b lim: 120 exec/s: 27 rss: 69Mb 00:08:51.796 ###### Recommended dictionary. ###### 00:08:51.796 "\377\001\347\304.\234\260x" # Uses: 1 00:08:51.796 ###### End of recommended dictionary. 
###### 00:08:51.796 Done 54 runs in 2 second(s) 00:08:52.055 07:04:10 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:52.055 07:04:10 -- ../common.sh@72 -- # (( i++ )) 00:08:52.055 07:04:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.055 07:04:10 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:52.055 07:04:10 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:52.055 07:04:10 -- vfio/run.sh@23 -- # local timen=1 00:08:52.055 07:04:10 -- vfio/run.sh@24 -- # local core=0x1 00:08:52.055 07:04:10 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.055 07:04:10 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:52.055 07:04:10 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:52.055 07:04:10 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:52.055 07:04:10 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:52.055 07:04:10 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.055 07:04:10 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:52.055 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:52.055 07:04:10 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:52.055 [2024-12-13 07:04:10.219398] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:52.055 [2024-12-13 07:04:10.219473] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid503995 ] 00:08:52.055 EAL: No free 2048 kB hugepages reported on node 1 00:08:52.055 [2024-12-13 07:04:10.290735] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.315 [2024-12-13 07:04:10.327705] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:52.315 [2024-12-13 07:04:10.327863] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.315 INFO: Running with entropic power schedule (0xFF, 100). 00:08:52.315 INFO: Seed: 1864280153 00:08:52.315 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:52.315 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:52.315 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.315 INFO: A corpus is not provided, starting from an empty corpus 00:08:52.315 #2 INITED exec/s: 0 rss: 59Mb 00:08:52.315 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:52.315 This may also happen if the target rejected all inputs we tried so far 00:08:52.574 [2024-12-13 07:04:10.592212] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.574 [2024-12-13 07:04:10.592262] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.833 NEW_FUNC[1/638]: 0x4543c8 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:52.833 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:52.833 #7 NEW cov: 10769 ft: 10324 corp: 2/65b lim: 90 exec/s: 0 rss: 65Mb L: 64/64 MS: 5 CopyPart-ChangeBit-InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:52.833 [2024-12-13 07:04:11.009371] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.833 [2024-12-13 07:04:11.009414] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.092 #8 NEW cov: 10786 ft: 13246 corp: 3/114b lim: 90 exec/s: 0 rss: 67Mb L: 49/64 MS: 1 EraseBytes- 00:08:53.092 [2024-12-13 07:04:11.135174] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.092 [2024-12-13 07:04:11.135224] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.092 #11 NEW cov: 10786 ft: 14598 corp: 4/152b lim: 90 exec/s: 0 rss: 68Mb L: 38/64 MS: 3 InsertByte-CopyPart-InsertRepeatedBytes- 00:08:53.092 [2024-12-13 07:04:11.270176] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.092 [2024-12-13 07:04:11.270229] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.351 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:53.351 #13 NEW cov: 10803 ft: 14956 corp: 5/161b lim: 90 exec/s: 0 rss: 68Mb L: 9/64 MS: 2 ChangeBit-CMP- DE: "\001\000\000\000\000\000\000m"- 00:08:53.351 [2024-12-13 07:04:11.385072] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.351 [2024-12-13 07:04:11.385107] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.351 #14 NEW cov: 10803 ft: 15610 corp: 6/199b lim: 90 exec/s: 0 rss: 68Mb L: 38/64 MS: 1 ChangeBit- 00:08:53.351 [2024-12-13 07:04:11.501131] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.351 [2024-12-13 07:04:11.501167] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.351 #15 NEW cov: 10803 ft: 15854 corp: 7/237b lim: 90 exec/s: 15 rss: 68Mb L: 38/64 MS: 1 CrossOver- 00:08:53.610 [2024-12-13 07:04:11.616087] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.610 [2024-12-13 07:04:11.616120] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.610 #16 NEW cov: 10803 ft: 16726 corp: 8/246b lim: 90 exec/s: 16 rss: 68Mb L: 9/64 MS: 1 ChangeByte- 00:08:53.610 [2024-12-13 07:04:11.740961] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.610 [2024-12-13 07:04:11.741005] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.610 #17 NEW cov: 10803 ft: 17089 corp: 9/288b lim: 90 exec/s: 17 rss: 68Mb 
L: 42/64 MS: 1 CrossOver- 00:08:53.869 [2024-12-13 07:04:11.856848] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.869 [2024-12-13 07:04:11.856883] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.869 #18 NEW cov: 10803 ft: 17163 corp: 10/305b lim: 90 exec/s: 18 rss: 68Mb L: 17/64 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000m"- 00:08:53.869 [2024-12-13 07:04:11.971634] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.869 [2024-12-13 07:04:11.971668] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.869 #19 NEW cov: 10803 ft: 17323 corp: 11/343b lim: 90 exec/s: 19 rss: 68Mb L: 38/64 MS: 1 CMP- DE: "\375\377\377\377"- 00:08:53.869 [2024-12-13 07:04:12.085489] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.869 [2024-12-13 07:04:12.085522] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.128 #20 NEW cov: 10803 ft: 17444 corp: 12/381b lim: 90 exec/s: 20 rss: 69Mb L: 38/64 MS: 1 ShuffleBytes- 00:08:54.128 [2024-12-13 07:04:12.200305] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.128 [2024-12-13 07:04:12.200339] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.128 #21 NEW cov: 10803 ft: 17495 corp: 13/419b lim: 90 exec/s: 21 rss: 69Mb L: 38/64 MS: 1 ChangeBit- 00:08:54.128 [2024-12-13 07:04:12.314159] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.128 [2024-12-13 07:04:12.314194] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.387 #22 NEW cov: 10810 ft: 17557 corp: 14/455b lim: 90 exec/s: 22 rss: 69Mb L: 36/64 MS: 1 CrossOver- 00:08:54.387 [2024-12-13 07:04:12.429013] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.387 [2024-12-13 07:04:12.429044] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.387 #23 NEW cov: 10810 ft: 17602 corp: 15/522b lim: 90 exec/s: 23 rss: 69Mb L: 67/67 MS: 1 InsertRepeatedBytes- 00:08:54.387 [2024-12-13 07:04:12.543798] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.387 [2024-12-13 07:04:12.543829] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.387 #24 NEW cov: 10810 ft: 17624 corp: 16/589b lim: 90 exec/s: 12 rss: 69Mb L: 67/67 MS: 1 ChangeByte- 00:08:54.387 #24 DONE cov: 10810 ft: 17624 corp: 16/589b lim: 90 exec/s: 12 rss: 69Mb 00:08:54.387 ###### Recommended dictionary. ###### 00:08:54.387 "\001\000\000\000\000\000\000m" # Uses: 1 00:08:54.387 "\375\377\377\377" # Uses: 0 00:08:54.387 ###### End of recommended dictionary. 
###### 00:08:54.387 Done 24 runs in 2 second(s) 00:08:54.646 07:04:12 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:54.646 07:04:12 -- ../common.sh@72 -- # (( i++ )) 00:08:54.646 07:04:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:54.646 07:04:12 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:54.646 00:08:54.646 real 0m19.274s 00:08:54.646 user 0m26.926s 00:08:54.646 sys 0m1.837s 00:08:54.646 07:04:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:54.646 07:04:12 -- common/autotest_common.sh@10 -- # set +x 00:08:54.646 ************************************ 00:08:54.646 END TEST vfio_fuzz 00:08:54.646 ************************************ 00:08:54.905 00:08:54.905 real 1m21.967s 00:08:54.905 user 2m5.956s 00:08:54.905 sys 0m8.988s 00:08:54.905 07:04:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:54.905 07:04:12 -- common/autotest_common.sh@10 -- # set +x 00:08:54.906 ************************************ 00:08:54.906 END TEST llvm_fuzz 00:08:54.906 ************************************ 00:08:54.906 07:04:12 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:08:54.906 07:04:12 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:08:54.906 07:04:12 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:08:54.906 07:04:12 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:54.906 07:04:12 -- common/autotest_common.sh@10 -- # set +x 00:08:54.906 07:04:12 -- spdk/autotest.sh@373 -- # autotest_cleanup 00:08:54.906 07:04:12 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:08:54.906 07:04:12 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:08:54.906 07:04:12 -- common/autotest_common.sh@10 -- # set +x 00:09:01.477 INFO: APP EXITING 00:09:01.478 INFO: killing all VMs 00:09:01.478 INFO: killing vhost app 00:09:01.478 INFO: EXIT DONE 00:09:04.016 Waiting for block devices as requested 00:09:04.016 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:04.016 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:04.016 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:04.016 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:04.016 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:04.016 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:04.275 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:04.275 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:04.275 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:04.535 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:04.535 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:04.535 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:04.535 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:04.794 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:04.794 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:04.794 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:05.053 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:08.343 Cleaning 00:09:08.343 Removing: /dev/shm/spdk_tgt_trace.pid466648 00:09:08.602 Removing: /var/run/dpdk/spdk_pid464167 00:09:08.602 Removing: /var/run/dpdk/spdk_pid465429 00:09:08.602 Removing: /var/run/dpdk/spdk_pid466648 00:09:08.602 Removing: /var/run/dpdk/spdk_pid467448 00:09:08.602 Removing: /var/run/dpdk/spdk_pid467783 00:09:08.602 Removing: /var/run/dpdk/spdk_pid468127 00:09:08.602 Removing: /var/run/dpdk/spdk_pid468466 00:09:08.602 Removing: /var/run/dpdk/spdk_pid468805 00:09:08.602 Removing: /var/run/dpdk/spdk_pid469093 00:09:08.602 Removing: /var/run/dpdk/spdk_pid469378 00:09:08.602 Removing: /var/run/dpdk/spdk_pid469694 00:09:08.602 
Removing: /var/run/dpdk/spdk_pid470553 00:09:08.602 Removing: /var/run/dpdk/spdk_pid473752 00:09:08.602 Removing: /var/run/dpdk/spdk_pid474052 00:09:08.603 Removing: /var/run/dpdk/spdk_pid474356 00:09:08.603 Removing: /var/run/dpdk/spdk_pid474566 00:09:08.603 Removing: /var/run/dpdk/spdk_pid475073 00:09:08.603 Removing: /var/run/dpdk/spdk_pid475210 00:09:08.603 Removing: /var/run/dpdk/spdk_pid475788 00:09:08.603 Removing: /var/run/dpdk/spdk_pid476032 00:09:08.603 Removing: /var/run/dpdk/spdk_pid476350 00:09:08.603 Removing: /var/run/dpdk/spdk_pid476372 00:09:08.603 Removing: /var/run/dpdk/spdk_pid476660 00:09:08.603 Removing: /var/run/dpdk/spdk_pid476878 00:09:08.603 Removing: /var/run/dpdk/spdk_pid477313 00:09:08.603 Removing: /var/run/dpdk/spdk_pid477597 00:09:08.603 Removing: /var/run/dpdk/spdk_pid477880 00:09:08.603 Removing: /var/run/dpdk/spdk_pid477962 00:09:08.603 Removing: /var/run/dpdk/spdk_pid478266 00:09:08.603 Removing: /var/run/dpdk/spdk_pid478305 00:09:08.603 Removing: /var/run/dpdk/spdk_pid478574 00:09:08.603 Removing: /var/run/dpdk/spdk_pid478725 00:09:08.603 Removing: /var/run/dpdk/spdk_pid478910 00:09:08.603 Removing: /var/run/dpdk/spdk_pid479167 00:09:08.603 Removing: /var/run/dpdk/spdk_pid479448 00:09:08.603 Removing: /var/run/dpdk/spdk_pid479720 00:09:08.603 Removing: /var/run/dpdk/spdk_pid480004 00:09:08.603 Removing: /var/run/dpdk/spdk_pid480206 00:09:08.603 Removing: /var/run/dpdk/spdk_pid480380 00:09:08.603 Removing: /var/run/dpdk/spdk_pid480583 00:09:08.603 Removing: /var/run/dpdk/spdk_pid480866 00:09:08.603 Removing: /var/run/dpdk/spdk_pid481135 00:09:08.603 Removing: /var/run/dpdk/spdk_pid481423 00:09:08.603 Removing: /var/run/dpdk/spdk_pid481679 00:09:08.603 Removing: /var/run/dpdk/spdk_pid481864 00:09:08.603 Removing: /var/run/dpdk/spdk_pid482018 00:09:08.603 Removing: /var/run/dpdk/spdk_pid482286 00:09:08.603 Removing: /var/run/dpdk/spdk_pid482554 00:09:08.603 Removing: /var/run/dpdk/spdk_pid482835 00:09:08.603 Removing: /var/run/dpdk/spdk_pid483101 00:09:08.603 Removing: /var/run/dpdk/spdk_pid483350 00:09:08.603 Removing: /var/run/dpdk/spdk_pid483496 00:09:08.603 Removing: /var/run/dpdk/spdk_pid483701 00:09:08.603 Removing: /var/run/dpdk/spdk_pid483964 00:09:08.603 Removing: /var/run/dpdk/spdk_pid484253 00:09:08.603 Removing: /var/run/dpdk/spdk_pid484519 00:09:08.603 Removing: /var/run/dpdk/spdk_pid484800 00:09:08.603 Removing: /var/run/dpdk/spdk_pid484954 00:09:08.603 Removing: /var/run/dpdk/spdk_pid485138 00:09:08.603 Removing: /var/run/dpdk/spdk_pid485381 00:09:08.862 Removing: /var/run/dpdk/spdk_pid485662 00:09:08.862 Removing: /var/run/dpdk/spdk_pid485931 00:09:08.862 Removing: /var/run/dpdk/spdk_pid486217 00:09:08.862 Removing: /var/run/dpdk/spdk_pid486448 00:09:08.862 Removing: /var/run/dpdk/spdk_pid486639 00:09:08.862 Removing: /var/run/dpdk/spdk_pid486811 00:09:08.862 Removing: /var/run/dpdk/spdk_pid487091 00:09:08.862 Removing: /var/run/dpdk/spdk_pid487357 00:09:08.862 Removing: /var/run/dpdk/spdk_pid487645 00:09:08.862 Removing: /var/run/dpdk/spdk_pid487912 00:09:08.862 Removing: /var/run/dpdk/spdk_pid488182 00:09:08.862 Removing: /var/run/dpdk/spdk_pid488277 00:09:08.862 Removing: /var/run/dpdk/spdk_pid488612 00:09:08.862 Removing: /var/run/dpdk/spdk_pid489211 00:09:08.862 Removing: /var/run/dpdk/spdk_pid489668 00:09:08.862 Removing: /var/run/dpdk/spdk_pid490213 00:09:08.862 Removing: /var/run/dpdk/spdk_pid490509 00:09:08.862 Removing: /var/run/dpdk/spdk_pid491151 00:09:08.862 Removing: /var/run/dpdk/spdk_pid491777 00:09:08.862 Removing: 
/var/run/dpdk/spdk_pid492433 00:09:08.862 Removing: /var/run/dpdk/spdk_pid492970 00:09:08.862 Removing: /var/run/dpdk/spdk_pid493306 00:09:08.862 Removing: /var/run/dpdk/spdk_pid493801 00:09:08.862 Removing: /var/run/dpdk/spdk_pid494295 00:09:08.862 Removing: /var/run/dpdk/spdk_pid494634 00:09:08.862 Removing: /var/run/dpdk/spdk_pid495171 00:09:08.862 Removing: /var/run/dpdk/spdk_pid495581 00:09:08.862 Removing: /var/run/dpdk/spdk_pid496008 00:09:08.862 Removing: /var/run/dpdk/spdk_pid496542 00:09:08.862 Removing: /var/run/dpdk/spdk_pid496835 00:09:08.862 Removing: /var/run/dpdk/spdk_pid497372 00:09:08.862 Removing: /var/run/dpdk/spdk_pid497769 00:09:08.862 Removing: /var/run/dpdk/spdk_pid498207 00:09:08.862 Removing: /var/run/dpdk/spdk_pid498747 00:09:08.862 Removing: /var/run/dpdk/spdk_pid499053 00:09:08.862 Removing: /var/run/dpdk/spdk_pid499573 00:09:08.862 Removing: /var/run/dpdk/spdk_pid500077 00:09:08.862 Removing: /var/run/dpdk/spdk_pid500406 00:09:08.862 Removing: /var/run/dpdk/spdk_pid501034 00:09:08.862 Removing: /var/run/dpdk/spdk_pid501588 00:09:08.862 Removing: /var/run/dpdk/spdk_pid502050 00:09:08.862 Removing: /var/run/dpdk/spdk_pid502443 00:09:08.862 Removing: /var/run/dpdk/spdk_pid502969 00:09:08.862 Removing: /var/run/dpdk/spdk_pid503520 00:09:08.862 Removing: /var/run/dpdk/spdk_pid503995 00:09:08.862 Clean 00:09:09.121 killing process with pid 415430 00:09:13.317 killing process with pid 415427 00:09:13.318 killing process with pid 415429 00:09:13.318 killing process with pid 415428 00:09:13.318 07:04:31 -- common/autotest_common.sh@1446 -- # return 0 00:09:13.318 07:04:31 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:09:13.318 07:04:31 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:13.318 07:04:31 -- common/autotest_common.sh@10 -- # set +x 00:09:13.318 07:04:31 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:09:13.318 07:04:31 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:13.318 07:04:31 -- common/autotest_common.sh@10 -- # set +x 00:09:13.318 07:04:31 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:13.318 07:04:31 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:13.318 07:04:31 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:13.318 07:04:31 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:09:13.318 07:04:31 -- spdk/autotest.sh@383 -- # hostname 00:09:13.318 07:04:31 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:13.318 geninfo: WARNING: invalid characters removed from testname! 
00:09:14.256 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:09:14.256 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:09:14.256 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:09:24.249 07:04:42 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:32.373 07:04:49 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:35.865 07:04:53 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:41.248 07:04:58 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:45.447 07:05:03 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:49.639 07:05:07 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 
-q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:54.913 07:05:12 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:54.913 07:05:12 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:09:54.913 07:05:12 -- common/autotest_common.sh@1690 -- $ lcov --version 00:09:54.913 07:05:12 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:09:54.913 07:05:12 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:09:54.913 07:05:12 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:09:54.913 07:05:12 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:09:54.913 07:05:12 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:09:54.913 07:05:12 -- scripts/common.sh@335 -- $ IFS=.-: 00:09:54.913 07:05:12 -- scripts/common.sh@335 -- $ read -ra ver1 00:09:54.913 07:05:12 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:54.913 07:05:12 -- scripts/common.sh@336 -- $ read -ra ver2 00:09:54.913 07:05:12 -- scripts/common.sh@337 -- $ local 'op=<' 00:09:54.913 07:05:12 -- scripts/common.sh@339 -- $ ver1_l=2 00:09:54.913 07:05:12 -- scripts/common.sh@340 -- $ ver2_l=1 00:09:54.913 07:05:12 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:09:54.913 07:05:12 -- scripts/common.sh@343 -- $ case "$op" in 00:09:54.913 07:05:12 -- scripts/common.sh@344 -- $ : 1 00:09:54.913 07:05:12 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:09:54.913 07:05:12 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:09:54.913 07:05:12 -- scripts/common.sh@364 -- $ decimal 1 00:09:54.913 07:05:12 -- scripts/common.sh@352 -- $ local d=1 00:09:54.913 07:05:12 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:54.913 07:05:12 -- scripts/common.sh@354 -- $ echo 1 00:09:54.913 07:05:12 -- scripts/common.sh@364 -- $ ver1[v]=1 00:09:54.913 07:05:12 -- scripts/common.sh@365 -- $ decimal 2 00:09:54.913 07:05:12 -- scripts/common.sh@352 -- $ local d=2 00:09:54.913 07:05:12 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:54.913 07:05:12 -- scripts/common.sh@354 -- $ echo 2 00:09:54.913 07:05:12 -- scripts/common.sh@365 -- $ ver2[v]=2 00:09:54.913 07:05:12 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:09:54.913 07:05:12 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:09:54.913 07:05:12 -- scripts/common.sh@367 -- $ return 0 00:09:54.913 07:05:12 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:54.913 07:05:12 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:09:54.913 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.913 --rc genhtml_branch_coverage=1 00:09:54.913 --rc genhtml_function_coverage=1 00:09:54.913 --rc genhtml_legend=1 00:09:54.913 --rc geninfo_all_blocks=1 00:09:54.913 --rc geninfo_unexecuted_blocks=1 00:09:54.913 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.913 ' 00:09:54.913 07:05:12 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:09:54.913 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.913 --rc genhtml_branch_coverage=1 00:09:54.913 --rc genhtml_function_coverage=1 00:09:54.913 --rc genhtml_legend=1 00:09:54.913 --rc geninfo_all_blocks=1 00:09:54.913 --rc geninfo_unexecuted_blocks=1 00:09:54.913 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.913 ' 00:09:54.913 
07:05:12 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov
00:09:54.913 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:54.913 --rc genhtml_branch_coverage=1
00:09:54.913 --rc genhtml_function_coverage=1
00:09:54.913 --rc genhtml_legend=1
00:09:54.913 --rc geninfo_all_blocks=1
00:09:54.913 --rc geninfo_unexecuted_blocks=1
00:09:54.913 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:54.913 '
00:09:54.913 07:05:12 -- common/autotest_common.sh@1704 -- $ LCOV='lcov
00:09:54.913 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:54.913 --rc genhtml_branch_coverage=1
00:09:54.913 --rc genhtml_function_coverage=1
00:09:54.913 --rc genhtml_legend=1
00:09:54.913 --rc geninfo_all_blocks=1
00:09:54.913 --rc geninfo_unexecuted_blocks=1
00:09:54.913 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:54.913 '
00:09:54.913 07:05:12 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:54.913 07:05:12 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:54.913 07:05:12 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:54.913 07:05:12 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:09:54.913 07:05:12 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:54.913 07:05:12 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:54.914 07:05:12 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:54.914 07:05:12 -- paths/export.sh@5 -- $ export PATH
00:09:54.914 07:05:12 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:54.914 07:05:12 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:54.914 07:05:12 -- common/autobuild_common.sh@440 -- $ date +%s
00:09:54.914 07:05:12 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1734069912.XXXXXX
00:09:54.914 07:05:12 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1734069912.Pi82y5
00:09:54.914 07:05:12 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
00:09:54.914 07:05:12 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']'
00:09:54.914 07:05:12 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:09:54.914 07:05:12 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:09:54.914 07:05:12 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:54.914 07:05:12 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:54.914 07:05:12 -- common/autobuild_common.sh@456 -- $ get_config_params
00:09:54.914 07:05:12 -- common/autotest_common.sh@397 -- $ xtrace_disable
00:09:54.914 07:05:12 -- common/autotest_common.sh@10 -- $ set +x
00:09:54.914 07:05:12 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
00:09:54.914 07:05:12 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:09:54.914 07:05:12 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:54.914 07:05:12 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:09:54.914 07:05:12 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:09:54.914 07:05:12 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:09:54.914 07:05:12 -- spdk/autopackage.sh@19 -- $ timing_finish
00:09:54.914 07:05:12 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:54.914 07:05:12 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:09:54.914 07:05:12 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:54.914 07:05:12 -- spdk/autopackage.sh@20 -- $ exit 0
00:09:54.914 + [[ -n 359555 ]]
00:09:54.914 + sudo kill 359555
00:09:54.924 [Pipeline] }
00:09:54.939 [Pipeline] // stage
00:09:54.945 [Pipeline] }
00:09:54.959 [Pipeline] // timeout
00:09:54.965 [Pipeline] }
00:09:54.979 [Pipeline] // catchError
00:09:54.984 [Pipeline] }
00:09:55.000 [Pipeline] // wrap
00:09:55.006 [Pipeline] }
00:09:55.019 [Pipeline] // catchError
00:09:55.029 [Pipeline] stage
00:09:55.031 [Pipeline] { (Epilogue)
00:09:55.045 [Pipeline] catchError
00:09:55.047 [Pipeline] {
00:09:55.059 [Pipeline] echo
00:09:55.061 Cleanup processes
00:09:55.067 [Pipeline] sh
00:09:55.355 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:55.355 513345 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:55.370 [Pipeline] sh
00:09:55.657 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:55.657 ++ awk '{print $1}'
00:09:55.657 ++ grep -v 'sudo pgrep'
00:09:55.657 + sudo kill -9
00:09:55.657 + true
00:09:55.669 [Pipeline] sh
00:09:55.954 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:55.954 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:55.954 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:57.332 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:07.325 [Pipeline] sh
00:10:07.609 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:07.609 Artifacts sizes are good
00:10:07.624 [Pipeline] archiveArtifacts
00:10:07.631 Archiving artifacts
00:10:07.755 [Pipeline] sh
00:10:08.042 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:08.057 [Pipeline] cleanWs
00:10:08.067 [WS-CLEANUP] Deleting project workspace...
00:10:08.067 [WS-CLEANUP] Deferred wipeout is used...
00:10:08.074 [WS-CLEANUP] done
00:10:08.076 [Pipeline] }
00:10:08.093 [Pipeline] // catchError
00:10:08.104 [Pipeline] sh
00:10:08.388 + logger -p user.info -t JENKINS-CI
00:10:08.397 [Pipeline] }
00:10:08.411 [Pipeline] // stage
00:10:08.416 [Pipeline] }
00:10:08.431 [Pipeline] // node
00:10:08.437 [Pipeline] End of Pipeline
00:10:08.491 Finished: SUCCESS