00:00:00.001 Started by upstream project "autotest-spdk-v24.01-LTS-vs-dpdk-v22.11" build number 968 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3635 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.069 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.070 The recommended git tool is: git 00:00:00.070 using credential 00000000-0000-0000-0000-000000000002 00:00:00.072 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.092 Fetching changes from the remote Git repository 00:00:00.095 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.129 Using shallow fetch with depth 1 00:00:00.129 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.129 > git --version # timeout=10 00:00:00.166 > git --version # 'git version 2.39.2' 00:00:00.166 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.196 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.196 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:04.338 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:04.349 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:04.361 Checking out Revision b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf (FETCH_HEAD) 00:00:04.361 > git config core.sparsecheckout # timeout=10 00:00:04.372 > git read-tree -mu HEAD # timeout=10 00:00:04.387 > git checkout -f b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=5 00:00:04.405 Commit message: "jenkins/jjb-config: Ignore OS version mismatch under freebsd" 00:00:04.405 > git rev-list --no-walk b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=10 00:00:04.495 [Pipeline] Start of Pipeline 00:00:04.509 [Pipeline] library 00:00:04.510 Loading library shm_lib@master 00:00:04.511 Library shm_lib@master is cached. Copying from home. 00:00:04.528 [Pipeline] node 00:00:04.543 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:04.545 [Pipeline] { 00:00:04.556 [Pipeline] catchError 00:00:04.557 [Pipeline] { 00:00:04.570 [Pipeline] wrap 00:00:04.580 [Pipeline] { 00:00:04.589 [Pipeline] stage 00:00:04.592 [Pipeline] { (Prologue) 00:00:04.804 [Pipeline] sh 00:00:05.094 + logger -p user.info -t JENKINS-CI 00:00:05.112 [Pipeline] echo 00:00:05.113 Node: WFP20 00:00:05.120 [Pipeline] sh 00:00:05.422 [Pipeline] setCustomBuildProperty 00:00:05.433 [Pipeline] echo 00:00:05.434 Cleanup processes 00:00:05.438 [Pipeline] sh 00:00:05.723 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:05.723 351949 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:05.735 [Pipeline] sh 00:00:06.021 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:06.021 ++ grep -v 'sudo pgrep' 00:00:06.021 ++ awk '{print $1}' 00:00:06.021 + sudo kill -9 00:00:06.021 + true 00:00:06.033 [Pipeline] cleanWs 00:00:06.041 [WS-CLEANUP] Deleting project workspace... 00:00:06.041 [WS-CLEANUP] Deferred wipeout is used... 
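The prologue above hunts down any SPDK processes left over from a previous run before wiping the workspace. Condensed into a standalone form, the idiom looks roughly like this (a sketch only; the job inlines the pipeline directly, and WORKSPACE/pids are illustrative names):

    WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    # List any process still referencing the workspace, drop the pgrep
    # invocation itself from the matches, and keep only the PID column.
    pids=$(sudo pgrep -af "$WORKSPACE" | grep -v 'sudo pgrep' | awk '{print $1}')
    # 'kill -9' with an empty PID list exits non-zero, hence the '+ true'
    # in the log: it keeps an errexit shell from failing the build when
    # there is nothing to clean up.
    sudo kill -9 $pids || true

Note that $pids is deliberately left unquoted so multiple PIDs split into separate kill arguments.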
00:00:06.047 [WS-CLEANUP] done 00:00:06.049 [Pipeline] setCustomBuildProperty 00:00:06.059 [Pipeline] sh 00:00:06.340 + sudo git config --global --replace-all safe.directory '*' 00:00:06.401 [Pipeline] httpRequest 00:00:07.152 [Pipeline] echo 00:00:07.153 Sorcerer 10.211.164.20 is alive 00:00:07.163 [Pipeline] retry 00:00:07.165 [Pipeline] { 00:00:07.178 [Pipeline] httpRequest 00:00:07.183 HttpMethod: GET 00:00:07.183 URL: http://10.211.164.20/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:07.184 Sending request to url: http://10.211.164.20/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:07.187 Response Code: HTTP/1.1 200 OK 00:00:07.188 Success: Status code 200 is in the accepted range: 200,404 00:00:07.188 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:08.573 [Pipeline] } 00:00:08.590 [Pipeline] // retry 00:00:08.597 [Pipeline] sh 00:00:08.883 + tar --no-same-owner -xf jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:08.898 [Pipeline] httpRequest 00:00:09.297 [Pipeline] echo 00:00:09.299 Sorcerer 10.211.164.20 is alive 00:00:09.310 [Pipeline] retry 00:00:09.312 [Pipeline] { 00:00:09.328 [Pipeline] httpRequest 00:00:09.333 HttpMethod: GET 00:00:09.334 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:09.335 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:09.352 Response Code: HTTP/1.1 200 OK 00:00:09.352 Success: Status code 200 is in the accepted range: 200,404 00:00:09.352 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:48.671 [Pipeline] } 00:00:48.689 [Pipeline] // retry 00:00:48.700 [Pipeline] sh 00:00:48.992 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:51.547 [Pipeline] sh 00:00:51.838 + git -C spdk log --oneline -n5 00:00:51.838 c13c99a5e test: Various fixes for Fedora40 00:00:51.838 726a04d70 test/nvmf: adjust timeout for bigger nvmes 00:00:51.838 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11 00:00:51.838 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched 00:00:51.838 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges 00:00:51.856 [Pipeline] withCredentials 00:00:51.868 > git --version # timeout=10 00:00:51.879 > git --version # 'git version 2.39.2' 00:00:51.896 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:00:51.898 [Pipeline] { 00:00:51.907 [Pipeline] retry 00:00:51.909 [Pipeline] { 00:00:51.922 [Pipeline] sh 00:00:52.205 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:00:52.218 [Pipeline] } 00:00:52.239 [Pipeline] // retry 00:00:52.245 [Pipeline] } 00:00:52.262 [Pipeline] // withCredentials 00:00:52.272 [Pipeline] httpRequest 00:00:52.669 [Pipeline] echo 00:00:52.671 Sorcerer 10.211.164.20 is alive 00:00:52.681 [Pipeline] retry 00:00:52.683 [Pipeline] { 00:00:52.697 [Pipeline] httpRequest 00:00:52.702 HttpMethod: GET 00:00:52.702 URL: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:00:52.703 Sending request to url: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:00:52.716 Response Code: HTTP/1.1 200 OK 00:00:52.717 Success: Status code 200 is in the accepted range: 200,404 00:00:52.717 Saving response body to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:09.235 [Pipeline] } 00:01:09.253 [Pipeline] // retry 00:01:09.262 [Pipeline] sh 00:01:09.552 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:10.951 [Pipeline] sh 00:01:11.239 + git -C dpdk log --oneline -n5 00:01:11.239 caf0f5d395 version: 22.11.4 00:01:11.239 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:11.239 dc9c799c7d vhost: fix missing spinlock unlock 00:01:11.239 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:11.239 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:11.250 [Pipeline] } 00:01:11.276 [Pipeline] // stage 00:01:11.286 [Pipeline] stage 00:01:11.288 [Pipeline] { (Prepare) 00:01:11.310 [Pipeline] writeFile 00:01:11.326 [Pipeline] sh 00:01:11.614 + logger -p user.info -t JENKINS-CI 00:01:11.627 [Pipeline] sh 00:01:11.914 + logger -p user.info -t JENKINS-CI 00:01:11.927 [Pipeline] sh 00:01:12.214 + cat autorun-spdk.conf 00:01:12.214 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:12.214 SPDK_RUN_UBSAN=1 00:01:12.214 SPDK_TEST_FUZZER=1 00:01:12.214 SPDK_TEST_FUZZER_SHORT=1 00:01:12.214 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:12.214 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:12.222 RUN_NIGHTLY=1 00:01:12.227 [Pipeline] readFile 00:01:12.252 [Pipeline] withEnv 00:01:12.254 [Pipeline] { 00:01:12.266 [Pipeline] sh 00:01:12.555 + set -ex 00:01:12.555 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:12.555 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:12.555 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:12.555 ++ SPDK_RUN_UBSAN=1 00:01:12.555 ++ SPDK_TEST_FUZZER=1 00:01:12.555 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:12.555 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:12.555 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:12.555 ++ RUN_NIGHTLY=1 00:01:12.555 + case $SPDK_TEST_NVMF_NICS in 00:01:12.555 + DRIVERS= 00:01:12.555 + [[ -n '' ]] 00:01:12.555 + exit 0 00:01:12.565 [Pipeline] } 00:01:12.581 [Pipeline] // withEnv 00:01:12.598 [Pipeline] } 00:01:12.612 [Pipeline] // stage 00:01:12.622 [Pipeline] catchError 00:01:12.624 [Pipeline] { 00:01:12.639 [Pipeline] timeout 00:01:12.640 Timeout set to expire in 30 min 00:01:12.642 [Pipeline] { 00:01:12.658 [Pipeline] stage 00:01:12.660 [Pipeline] { (Tests) 00:01:12.676 [Pipeline] sh 00:01:12.969 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:12.969 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:12.969 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:12.969 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:12.969 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:12.969 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:12.969 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:12.969 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:12.969 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:12.969 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:12.969 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:12.969 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:12.969 + source /etc/os-release
00:01:12.969 ++ NAME='Fedora Linux'
00:01:12.969 ++ VERSION='39 (Cloud Edition)'
00:01:12.969 ++ ID=fedora
00:01:12.969 ++ VERSION_ID=39
00:01:12.969 ++ VERSION_CODENAME=
00:01:12.969 ++ PLATFORM_ID=platform:f39
00:01:12.969 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:12.969 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:12.969 ++ LOGO=fedora-logo-icon
00:01:12.969 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:12.969 ++ HOME_URL=https://fedoraproject.org/
00:01:12.969 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:12.969 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:12.969 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:12.969 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:12.969 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:12.969 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:12.969 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:12.969 ++ SUPPORT_END=2024-11-12
00:01:12.969 ++ VARIANT='Cloud Edition'
00:01:12.969 ++ VARIANT_ID=cloud
00:01:12.970 + uname -a
00:01:12.970 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:12.970 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:16.269 Hugepages
00:01:16.269 node hugesize free / total
00:01:16.269 node0 1048576kB 0 / 0
00:01:16.269 node0 2048kB 0 / 0
00:01:16.269 node1 1048576kB 0 / 0
00:01:16.269 node1 2048kB 0 / 0
00:01:16.269
00:01:16.269 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:16.269 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:16.269 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:16.269 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:16.269 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:16.269 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:16.269 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:16.269 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:16.269 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:16.269 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:16.269 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:16.269 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:16.269 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:16.269 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:16.269 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:16.269 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:16.269 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:16.269 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:16.269 + rm -f /tmp/spdk-ld-path
00:01:16.269 + source autorun-spdk.conf
00:01:16.269 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:16.269 ++ SPDK_RUN_UBSAN=1
00:01:16.269 ++ SPDK_TEST_FUZZER=1
00:01:16.269 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:16.269 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:16.269 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:16.269 ++ RUN_NIGHTLY=1
00:01:16.269 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:16.269 + [[ -n '' ]]
00:01:16.269 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:16.269 + for M in /var/spdk/build-*-manifest.txt
00:01:16.269 + [[ -f
/var/spdk/build-kernel-manifest.txt ]] 00:01:16.269 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:16.269 + for M in /var/spdk/build-*-manifest.txt 00:01:16.269 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:16.269 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:16.269 + for M in /var/spdk/build-*-manifest.txt 00:01:16.269 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:16.269 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:16.269 ++ uname 00:01:16.269 + [[ Linux == \L\i\n\u\x ]] 00:01:16.269 + sudo dmesg -T 00:01:16.269 + sudo dmesg --clear 00:01:16.269 + dmesg_pid=353445 00:01:16.269 + [[ Fedora Linux == FreeBSD ]] 00:01:16.269 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:16.269 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:16.269 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:16.269 + [[ -x /usr/src/fio-static/fio ]] 00:01:16.269 + export FIO_BIN=/usr/src/fio-static/fio 00:01:16.269 + FIO_BIN=/usr/src/fio-static/fio 00:01:16.269 + sudo dmesg -Tw 00:01:16.269 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:16.269 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:16.269 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:16.269 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:16.269 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:16.269 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:16.269 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:16.269 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:16.269 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:16.269 Test configuration: 00:01:16.269 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:16.269 SPDK_RUN_UBSAN=1 00:01:16.269 SPDK_TEST_FUZZER=1 00:01:16.269 SPDK_TEST_FUZZER_SHORT=1 00:01:16.269 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:16.269 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:16.269 RUN_NIGHTLY=1 16:37:01 -- common/autotest_common.sh@1689 -- $ [[ n == y ]] 00:01:16.269 16:37:01 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:16.269 16:37:01 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:16.269 16:37:01 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:16.269 16:37:01 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:16.269 16:37:01 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.269 16:37:01 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.269 16:37:01 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.269 16:37:01 -- paths/export.sh@5 -- $ export PATH 00:01:16.269 16:37:01 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:16.269 16:37:01 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:16.269 16:37:01 -- common/autobuild_common.sh@440 -- $ date +%s 00:01:16.269 16:37:01 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1731771421.XXXXXX 00:01:16.269 16:37:01 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1731771421.RF1W8c 00:01:16.269 16:37:01 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:01:16.269 16:37:01 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']' 00:01:16.269 16:37:01 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:16.269 16:37:01 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:16.269 16:37:01 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:16.269 16:37:01 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:16.269 16:37:01 -- common/autobuild_common.sh@456 -- $ get_config_params 00:01:16.269 16:37:01 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:01:16.269 16:37:01 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.269 16:37:01 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:16.269 16:37:01 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:16.269 16:37:01 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:16.269 16:37:01 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:16.269 16:37:01 -- spdk/autobuild.sh@16 -- $ date -u 00:01:16.269 Sat Nov 16 03:37:01 PM UTC 2024 00:01:16.269 16:37:01 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:16.269 LTS-67-gc13c99a5e 00:01:16.269 16:37:01 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:16.269 16:37:01 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:16.269 16:37:01 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:16.270 16:37:01 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:16.270 16:37:01 -- common/autotest_common.sh@1093 -- $ 
xtrace_disable 00:01:16.270 16:37:01 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.270 ************************************ 00:01:16.270 START TEST ubsan 00:01:16.270 ************************************ 00:01:16.270 16:37:01 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan' 00:01:16.270 using ubsan 00:01:16.270 00:01:16.270 real 0m0.000s 00:01:16.270 user 0m0.000s 00:01:16.270 sys 0m0.000s 00:01:16.270 16:37:01 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:16.270 16:37:01 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.270 ************************************ 00:01:16.270 END TEST ubsan 00:01:16.270 ************************************ 00:01:16.270 16:37:01 -- spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:16.270 16:37:01 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:16.270 16:37:01 -- common/autobuild_common.sh@432 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:16.270 16:37:01 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:01:16.270 16:37:01 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:16.270 16:37:01 -- common/autotest_common.sh@10 -- $ set +x 00:01:16.270 ************************************ 00:01:16.270 START TEST build_native_dpdk 00:01:16.270 ************************************ 00:01:16.270 16:37:01 -- common/autotest_common.sh@1114 -- $ _build_native_dpdk 00:01:16.270 16:37:01 -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:16.270 16:37:01 -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:16.270 16:37:01 -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:16.270 16:37:01 -- common/autobuild_common.sh@51 -- $ local compiler 00:01:16.270 16:37:01 -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:16.270 16:37:01 -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:16.270 16:37:01 -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:16.270 16:37:01 -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:16.270 16:37:01 -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:16.270 16:37:01 -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:16.270 16:37:01 -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:16.270 16:37:01 -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:16.270 16:37:01 -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:16.270 16:37:01 -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:16.270 16:37:01 -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:16.270 16:37:01 -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:16.270 16:37:01 -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:16.270 caf0f5d395 version: 22.11.4 00:01:16.270 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:16.270 dc9c799c7d vhost: fix missing spinlock unlock 00:01:16.270 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:16.270 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:16.270 16:37:01 -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:16.270 16:37:01 -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:16.270 16:37:01 -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:16.270 16:37:01 -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:16.270 16:37:01 -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:16.270 16:37:01 -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:16.270 16:37:01 -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:16.270 16:37:01 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:16.270 16:37:01 -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:16.270 16:37:02 -- common/autobuild_common.sh@168 -- $ uname -s 00:01:16.270 16:37:02 -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:16.270 16:37:02 -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:01:16.270 16:37:02 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:01:16.270 16:37:02 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:16.270 16:37:02 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:16.270 16:37:02 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:16.270 16:37:02 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:16.270 16:37:02 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:16.270 16:37:02 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:16.270 16:37:02 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:16.270 16:37:02 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:16.270 16:37:02 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:16.270 16:37:02 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:16.270 16:37:02 -- scripts/common.sh@343 -- $ case "$op" in 00:01:16.270 16:37:02 -- scripts/common.sh@344 -- $ : 1 00:01:16.270 16:37:02 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:16.270 16:37:02 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:16.270 16:37:02 -- scripts/common.sh@364 -- $ decimal 22 00:01:16.270 16:37:02 -- scripts/common.sh@352 -- $ local d=22 00:01:16.270 16:37:02 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:16.270 16:37:02 -- scripts/common.sh@354 -- $ echo 22 00:01:16.531 16:37:02 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:16.531 16:37:02 -- scripts/common.sh@365 -- $ decimal 21 00:01:16.531 16:37:02 -- scripts/common.sh@352 -- $ local d=21 00:01:16.531 16:37:02 -- scripts/common.sh@353 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:16.531 16:37:02 -- scripts/common.sh@354 -- $ echo 21 00:01:16.531 16:37:02 -- scripts/common.sh@365 -- $ ver2[v]=21 00:01:16.531 16:37:02 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:16.531 16:37:02 -- scripts/common.sh@366 -- $ return 1 00:01:16.531 16:37:02 -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:16.531 patching file config/rte_config.h 00:01:16.531 Hunk #1 succeeded at 60 (offset 1 line). 00:01:16.531 16:37:02 -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:01:16.531 16:37:02 -- scripts/common.sh@372 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:01:16.531 16:37:02 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:01:16.531 16:37:02 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:01:16.531 16:37:02 -- scripts/common.sh@335 -- $ IFS=.-: 00:01:16.531 16:37:02 -- scripts/common.sh@335 -- $ read -ra ver1 00:01:16.531 16:37:02 -- scripts/common.sh@336 -- $ IFS=.-: 00:01:16.531 16:37:02 -- scripts/common.sh@336 -- $ read -ra ver2 00:01:16.531 16:37:02 -- scripts/common.sh@337 -- $ local 'op=<' 00:01:16.531 16:37:02 -- scripts/common.sh@339 -- $ ver1_l=3 00:01:16.531 16:37:02 -- scripts/common.sh@340 -- $ ver2_l=3 00:01:16.531 16:37:02 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:01:16.531 16:37:02 -- scripts/common.sh@343 -- $ case "$op" in 00:01:16.531 16:37:02 -- scripts/common.sh@344 -- $ : 1 00:01:16.531 16:37:02 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:01:16.531 16:37:02 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:16.531 16:37:02 -- scripts/common.sh@364 -- $ decimal 22 00:01:16.531 16:37:02 -- scripts/common.sh@352 -- $ local d=22 00:01:16.531 16:37:02 -- scripts/common.sh@353 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:16.531 16:37:02 -- scripts/common.sh@354 -- $ echo 22 00:01:16.531 16:37:02 -- scripts/common.sh@364 -- $ ver1[v]=22 00:01:16.531 16:37:02 -- scripts/common.sh@365 -- $ decimal 24 00:01:16.531 16:37:02 -- scripts/common.sh@352 -- $ local d=24 00:01:16.531 16:37:02 -- scripts/common.sh@353 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:16.531 16:37:02 -- scripts/common.sh@354 -- $ echo 24 00:01:16.531 16:37:02 -- scripts/common.sh@365 -- $ ver2[v]=24 00:01:16.531 16:37:02 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:01:16.531 16:37:02 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:01:16.531 16:37:02 -- scripts/common.sh@367 -- $ return 0 00:01:16.531 16:37:02 -- common/autobuild_common.sh@177 -- $ patch -p1 00:01:16.531 patching file lib/pcapng/rte_pcapng.c 00:01:16.531 Hunk #1 succeeded at 110 (offset -18 lines). 
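Both patch decisions above come out of SPDK's pure-shell version comparison traced in the log: lt A B succeeds when version A sorts strictly before B. The rte_config.h patch runs because 22.11.4 is not older than 21.11.0 (lt returns 1), and the rte_pcapng.c patch runs because 22.11.4 predates 24.07.0 (lt returns 0). A minimal standalone sketch of that logic follows; the real implementation is lt/cmp_versions in spdk/scripts/common.sh, and this condensed form elides the "decimal" normalization helper seen in the trace:

    lt() {   # lt A B -> exit 0 iff version A sorts strictly before B
        local IFS=.-:            # split versions on dots, dashes, and colons
        local -a ver1 ver2
        local v
        read -ra ver1 <<< "$1"
        read -ra ver2 <<< "$2"
        # Walk the components of the longer version; absent ones count as 0.
        for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
            ((${ver1[v]:-0} < ${ver2[v]:-0})) && return 0
            ((${ver1[v]:-0} > ${ver2[v]:-0})) && return 1
        done
        return 1                 # equal versions are not strictly less-than
    }

    lt 22.11.4 21.11.0; echo $?   # 1: not older, so the rte_config.h patch applies
    lt 22.11.4 24.07.0; echo $?   # 0: predates 24.07.0, so the pcapng patch applies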
00:01:16.531 16:37:02 -- common/autobuild_common.sh@180 -- $ dpdk_kmods=false 00:01:16.531 16:37:02 -- common/autobuild_common.sh@181 -- $ uname -s 00:01:16.531 16:37:02 -- common/autobuild_common.sh@181 -- $ '[' Linux = FreeBSD ']' 00:01:16.531 16:37:02 -- common/autobuild_common.sh@185 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:01:16.531 16:37:02 -- common/autobuild_common.sh@185 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:21.814 The Meson build system 00:01:21.814 Version: 1.5.0 00:01:21.814 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:21.814 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:01:21.814 Build type: native build 00:01:21.814 Program cat found: YES (/usr/bin/cat) 00:01:21.814 Project name: DPDK 00:01:21.814 Project version: 22.11.4 00:01:21.814 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:01:21.814 C linker for the host machine: gcc ld.bfd 2.40-14 00:01:21.814 Host machine cpu family: x86_64 00:01:21.814 Host machine cpu: x86_64 00:01:21.814 Message: ## Building in Developer Mode ## 00:01:21.814 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:21.814 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:01:21.814 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:01:21.814 Program objdump found: YES (/usr/bin/objdump) 00:01:21.814 Program python3 found: YES (/usr/bin/python3) 00:01:21.814 Program cat found: YES (/usr/bin/cat) 00:01:21.814 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
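One detail of the meson invocation above: the -Denable_drivers value, trailing comma included, is produced by printf reusing its "%s," format for every element of the DPDK_DRIVERS array declared earlier, and this configure run accepts the trailing comma, as the options summary further below confirms. A minimal sketch of the idiom:

    drivers=(bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base)
    printf %s, "${drivers[@]}"
    # prints: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,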
00:01:21.814 Checking for size of "void *" : 8 00:01:21.814 Checking for size of "void *" : 8 (cached) 00:01:21.814 Library m found: YES 00:01:21.814 Library numa found: YES 00:01:21.814 Has header "numaif.h" : YES 00:01:21.814 Library fdt found: NO 00:01:21.814 Library execinfo found: NO 00:01:21.814 Has header "execinfo.h" : YES 00:01:21.814 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:21.814 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:21.814 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:21.814 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:21.814 Run-time dependency openssl found: YES 3.1.1 00:01:21.814 Run-time dependency libpcap found: YES 1.10.4 00:01:21.814 Has header "pcap.h" with dependency libpcap: YES 00:01:21.814 Compiler for C supports arguments -Wcast-qual: YES 00:01:21.814 Compiler for C supports arguments -Wdeprecated: YES 00:01:21.814 Compiler for C supports arguments -Wformat: YES 00:01:21.814 Compiler for C supports arguments -Wformat-nonliteral: NO 00:01:21.814 Compiler for C supports arguments -Wformat-security: NO 00:01:21.814 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:21.814 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:21.814 Compiler for C supports arguments -Wnested-externs: YES 00:01:21.814 Compiler for C supports arguments -Wold-style-definition: YES 00:01:21.814 Compiler for C supports arguments -Wpointer-arith: YES 00:01:21.814 Compiler for C supports arguments -Wsign-compare: YES 00:01:21.814 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:21.814 Compiler for C supports arguments -Wundef: YES 00:01:21.814 Compiler for C supports arguments -Wwrite-strings: YES 00:01:21.814 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:21.814 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:01:21.814 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:21.814 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:01:21.814 Compiler for C supports arguments -mavx512f: YES 00:01:21.814 Checking if "AVX512 checking" compiles: YES 00:01:21.814 Fetching value of define "__SSE4_2__" : 1 00:01:21.814 Fetching value of define "__AES__" : 1 00:01:21.814 Fetching value of define "__AVX__" : 1 00:01:21.814 Fetching value of define "__AVX2__" : 1 00:01:21.814 Fetching value of define "__AVX512BW__" : 1 00:01:21.814 Fetching value of define "__AVX512CD__" : 1 00:01:21.814 Fetching value of define "__AVX512DQ__" : 1 00:01:21.814 Fetching value of define "__AVX512F__" : 1 00:01:21.814 Fetching value of define "__AVX512VL__" : 1 00:01:21.814 Fetching value of define "__PCLMUL__" : 1 00:01:21.814 Fetching value of define "__RDRND__" : 1 00:01:21.814 Fetching value of define "__RDSEED__" : 1 00:01:21.814 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:21.814 Compiler for C supports arguments -Wno-format-truncation: YES 00:01:21.814 Message: lib/kvargs: Defining dependency "kvargs" 00:01:21.814 Message: lib/telemetry: Defining dependency "telemetry" 00:01:21.814 Checking for function "getentropy" : YES 00:01:21.814 Message: lib/eal: Defining dependency "eal" 00:01:21.814 Message: lib/ring: Defining dependency "ring" 00:01:21.814 Message: lib/rcu: Defining dependency "rcu" 00:01:21.814 Message: lib/mempool: Defining dependency "mempool" 00:01:21.814 Message: lib/mbuf: Defining dependency "mbuf" 00:01:21.814 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:21.814 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:01:21.814 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:21.814 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:21.814 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:21.814 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:21.814 Compiler for C supports arguments -mpclmul: YES 00:01:21.815 Compiler for C supports arguments -maes: YES 00:01:21.815 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:21.815 Compiler for C supports arguments -mavx512bw: YES 00:01:21.815 Compiler for C supports arguments -mavx512dq: YES 00:01:21.815 Compiler for C supports arguments -mavx512vl: YES 00:01:21.815 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:21.815 Compiler for C supports arguments -mavx2: YES 00:01:21.815 Compiler for C supports arguments -mavx: YES 00:01:21.815 Message: lib/net: Defining dependency "net" 00:01:21.815 Message: lib/meter: Defining dependency "meter" 00:01:21.815 Message: lib/ethdev: Defining dependency "ethdev" 00:01:21.815 Message: lib/pci: Defining dependency "pci" 00:01:21.815 Message: lib/cmdline: Defining dependency "cmdline" 00:01:21.815 Message: lib/metrics: Defining dependency "metrics" 00:01:21.815 Message: lib/hash: Defining dependency "hash" 00:01:21.815 Message: lib/timer: Defining dependency "timer" 00:01:21.815 Fetching value of define "__AVX2__" : 1 (cached) 00:01:21.815 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:21.815 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:21.815 Fetching value of define "__AVX512CD__" : 1 (cached) 00:01:21.815 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:21.815 Message: lib/acl: Defining dependency "acl" 00:01:21.815 Message: lib/bbdev: Defining dependency "bbdev" 00:01:21.815 Message: lib/bitratestats: Defining dependency "bitratestats" 00:01:21.815 Run-time dependency libelf found: YES 0.191 00:01:21.815 Message: lib/bpf: Defining dependency "bpf" 00:01:21.815 Message: lib/cfgfile: Defining dependency "cfgfile" 00:01:21.815 Message: lib/compressdev: Defining dependency "compressdev" 00:01:21.815 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:21.815 Message: lib/distributor: Defining dependency "distributor" 00:01:21.815 Message: lib/efd: Defining dependency "efd" 00:01:21.815 Message: lib/eventdev: Defining dependency "eventdev" 00:01:21.815 Message: lib/gpudev: Defining dependency "gpudev" 00:01:21.815 Message: lib/gro: Defining dependency "gro" 00:01:21.815 Message: lib/gso: Defining dependency "gso" 00:01:21.815 Message: lib/ip_frag: Defining dependency "ip_frag" 00:01:21.815 Message: lib/jobstats: Defining dependency "jobstats" 00:01:21.815 Message: lib/latencystats: Defining dependency "latencystats" 00:01:21.815 Message: lib/lpm: Defining dependency "lpm" 00:01:21.815 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:21.815 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:21.815 Fetching value of define "__AVX512IFMA__" : (undefined) 00:01:21.815 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:01:21.815 Message: lib/member: Defining dependency "member" 00:01:21.815 Message: lib/pcapng: Defining dependency "pcapng" 00:01:21.815 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:21.815 Message: lib/power: Defining dependency "power" 00:01:21.815 Message: lib/rawdev: Defining dependency "rawdev" 00:01:21.815 Message: lib/regexdev: Defining dependency "regexdev" 00:01:21.815 Message: lib/dmadev: 
Defining dependency "dmadev" 00:01:21.815 Message: lib/rib: Defining dependency "rib" 00:01:21.815 Message: lib/reorder: Defining dependency "reorder" 00:01:21.815 Message: lib/sched: Defining dependency "sched" 00:01:21.815 Message: lib/security: Defining dependency "security" 00:01:21.815 Message: lib/stack: Defining dependency "stack" 00:01:21.815 Has header "linux/userfaultfd.h" : YES 00:01:21.815 Message: lib/vhost: Defining dependency "vhost" 00:01:21.815 Message: lib/ipsec: Defining dependency "ipsec" 00:01:21.815 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:21.815 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:21.815 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:21.815 Message: lib/fib: Defining dependency "fib" 00:01:21.815 Message: lib/port: Defining dependency "port" 00:01:21.815 Message: lib/pdump: Defining dependency "pdump" 00:01:21.815 Message: lib/table: Defining dependency "table" 00:01:21.815 Message: lib/pipeline: Defining dependency "pipeline" 00:01:21.815 Message: lib/graph: Defining dependency "graph" 00:01:21.815 Message: lib/node: Defining dependency "node" 00:01:21.815 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:01:21.815 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:21.815 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:21.815 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:21.815 Compiler for C supports arguments -Wno-sign-compare: YES 00:01:21.815 Compiler for C supports arguments -Wno-unused-value: YES 00:01:21.815 Compiler for C supports arguments -Wno-format: YES 00:01:21.815 Compiler for C supports arguments -Wno-format-security: YES 00:01:21.815 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:01:22.399 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:01:22.399 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:01:22.399 Compiler for C supports arguments -Wno-unused-parameter: YES 00:01:22.399 Fetching value of define "__AVX2__" : 1 (cached) 00:01:22.399 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:22.399 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:22.399 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:22.399 Compiler for C supports arguments -mavx512bw: YES (cached) 00:01:22.399 Compiler for C supports arguments -march=skylake-avx512: YES 00:01:22.399 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:01:22.399 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:22.399 Configuring doxy-api.conf using configuration 00:01:22.399 Program sphinx-build found: NO 00:01:22.399 Configuring rte_build_config.h using configuration 00:01:22.399 Message: 00:01:22.399 ================= 00:01:22.399 Applications Enabled 00:01:22.399 ================= 00:01:22.399 00:01:22.399 apps: 00:01:22.399 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf, 00:01:22.399 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad, 00:01:22.399 test-security-perf, 00:01:22.399 00:01:22.399 Message: 00:01:22.399 ================= 00:01:22.399 Libraries Enabled 00:01:22.399 ================= 00:01:22.399 00:01:22.399 libs: 00:01:22.399 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net, 00:01:22.399 meter, ethdev, pci, cmdline, metrics, hash, timer, acl, 00:01:22.399 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd, 00:01:22.399 eventdev, 
gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm, 00:01:22.399 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder, 00:01:22.399 sched, security, stack, vhost, ipsec, fib, port, pdump, 00:01:22.399 table, pipeline, graph, node, 00:01:22.399 00:01:22.399 Message: 00:01:22.399 =============== 00:01:22.399 Drivers Enabled 00:01:22.399 =============== 00:01:22.399 00:01:22.399 common: 00:01:22.399 00:01:22.399 bus: 00:01:22.399 pci, vdev, 00:01:22.399 mempool: 00:01:22.399 ring, 00:01:22.399 dma: 00:01:22.399 00:01:22.399 net: 00:01:22.400 i40e, 00:01:22.400 raw: 00:01:22.400 00:01:22.400 crypto: 00:01:22.400 00:01:22.400 compress: 00:01:22.400 00:01:22.400 regex: 00:01:22.400 00:01:22.400 vdpa: 00:01:22.400 00:01:22.400 event: 00:01:22.400 00:01:22.400 baseband: 00:01:22.400 00:01:22.400 gpu: 00:01:22.400 00:01:22.400 00:01:22.400 Message: 00:01:22.400 ================= 00:01:22.400 Content Skipped 00:01:22.400 ================= 00:01:22.400 00:01:22.400 apps: 00:01:22.400 00:01:22.400 libs: 00:01:22.400 kni: explicitly disabled via build config (deprecated lib) 00:01:22.400 flow_classify: explicitly disabled via build config (deprecated lib) 00:01:22.400 00:01:22.400 drivers: 00:01:22.400 common/cpt: not in enabled drivers build config 00:01:22.400 common/dpaax: not in enabled drivers build config 00:01:22.400 common/iavf: not in enabled drivers build config 00:01:22.400 common/idpf: not in enabled drivers build config 00:01:22.400 common/mvep: not in enabled drivers build config 00:01:22.400 common/octeontx: not in enabled drivers build config 00:01:22.400 bus/auxiliary: not in enabled drivers build config 00:01:22.400 bus/dpaa: not in enabled drivers build config 00:01:22.400 bus/fslmc: not in enabled drivers build config 00:01:22.400 bus/ifpga: not in enabled drivers build config 00:01:22.400 bus/vmbus: not in enabled drivers build config 00:01:22.400 common/cnxk: not in enabled drivers build config 00:01:22.400 common/mlx5: not in enabled drivers build config 00:01:22.400 common/qat: not in enabled drivers build config 00:01:22.400 common/sfc_efx: not in enabled drivers build config 00:01:22.400 mempool/bucket: not in enabled drivers build config 00:01:22.400 mempool/cnxk: not in enabled drivers build config 00:01:22.400 mempool/dpaa: not in enabled drivers build config 00:01:22.400 mempool/dpaa2: not in enabled drivers build config 00:01:22.400 mempool/octeontx: not in enabled drivers build config 00:01:22.400 mempool/stack: not in enabled drivers build config 00:01:22.400 dma/cnxk: not in enabled drivers build config 00:01:22.400 dma/dpaa: not in enabled drivers build config 00:01:22.400 dma/dpaa2: not in enabled drivers build config 00:01:22.400 dma/hisilicon: not in enabled drivers build config 00:01:22.400 dma/idxd: not in enabled drivers build config 00:01:22.400 dma/ioat: not in enabled drivers build config 00:01:22.400 dma/skeleton: not in enabled drivers build config 00:01:22.400 net/af_packet: not in enabled drivers build config 00:01:22.400 net/af_xdp: not in enabled drivers build config 00:01:22.400 net/ark: not in enabled drivers build config 00:01:22.400 net/atlantic: not in enabled drivers build config 00:01:22.400 net/avp: not in enabled drivers build config 00:01:22.400 net/axgbe: not in enabled drivers build config 00:01:22.400 net/bnx2x: not in enabled drivers build config 00:01:22.400 net/bnxt: not in enabled drivers build config 00:01:22.400 net/bonding: not in enabled drivers build config 00:01:22.400 net/cnxk: not in enabled drivers build config 
00:01:22.400 net/cxgbe: not in enabled drivers build config 00:01:22.400 net/dpaa: not in enabled drivers build config 00:01:22.400 net/dpaa2: not in enabled drivers build config 00:01:22.400 net/e1000: not in enabled drivers build config 00:01:22.400 net/ena: not in enabled drivers build config 00:01:22.400 net/enetc: not in enabled drivers build config 00:01:22.400 net/enetfec: not in enabled drivers build config 00:01:22.400 net/enic: not in enabled drivers build config 00:01:22.400 net/failsafe: not in enabled drivers build config 00:01:22.400 net/fm10k: not in enabled drivers build config 00:01:22.400 net/gve: not in enabled drivers build config 00:01:22.400 net/hinic: not in enabled drivers build config 00:01:22.400 net/hns3: not in enabled drivers build config 00:01:22.400 net/iavf: not in enabled drivers build config 00:01:22.400 net/ice: not in enabled drivers build config 00:01:22.400 net/idpf: not in enabled drivers build config 00:01:22.400 net/igc: not in enabled drivers build config 00:01:22.400 net/ionic: not in enabled drivers build config 00:01:22.400 net/ipn3ke: not in enabled drivers build config 00:01:22.400 net/ixgbe: not in enabled drivers build config 00:01:22.400 net/kni: not in enabled drivers build config 00:01:22.400 net/liquidio: not in enabled drivers build config 00:01:22.400 net/mana: not in enabled drivers build config 00:01:22.400 net/memif: not in enabled drivers build config 00:01:22.400 net/mlx4: not in enabled drivers build config 00:01:22.400 net/mlx5: not in enabled drivers build config 00:01:22.400 net/mvneta: not in enabled drivers build config 00:01:22.400 net/mvpp2: not in enabled drivers build config 00:01:22.400 net/netvsc: not in enabled drivers build config 00:01:22.400 net/nfb: not in enabled drivers build config 00:01:22.400 net/nfp: not in enabled drivers build config 00:01:22.400 net/ngbe: not in enabled drivers build config 00:01:22.400 net/null: not in enabled drivers build config 00:01:22.400 net/octeontx: not in enabled drivers build config 00:01:22.400 net/octeon_ep: not in enabled drivers build config 00:01:22.400 net/pcap: not in enabled drivers build config 00:01:22.400 net/pfe: not in enabled drivers build config 00:01:22.400 net/qede: not in enabled drivers build config 00:01:22.400 net/ring: not in enabled drivers build config 00:01:22.400 net/sfc: not in enabled drivers build config 00:01:22.400 net/softnic: not in enabled drivers build config 00:01:22.400 net/tap: not in enabled drivers build config 00:01:22.400 net/thunderx: not in enabled drivers build config 00:01:22.400 net/txgbe: not in enabled drivers build config 00:01:22.400 net/vdev_netvsc: not in enabled drivers build config 00:01:22.400 net/vhost: not in enabled drivers build config 00:01:22.400 net/virtio: not in enabled drivers build config 00:01:22.400 net/vmxnet3: not in enabled drivers build config 00:01:22.400 raw/cnxk_bphy: not in enabled drivers build config 00:01:22.400 raw/cnxk_gpio: not in enabled drivers build config 00:01:22.400 raw/dpaa2_cmdif: not in enabled drivers build config 00:01:22.400 raw/ifpga: not in enabled drivers build config 00:01:22.400 raw/ntb: not in enabled drivers build config 00:01:22.400 raw/skeleton: not in enabled drivers build config 00:01:22.400 crypto/armv8: not in enabled drivers build config 00:01:22.400 crypto/bcmfs: not in enabled drivers build config 00:01:22.400 crypto/caam_jr: not in enabled drivers build config 00:01:22.400 crypto/ccp: not in enabled drivers build config 00:01:22.400 crypto/cnxk: not in enabled drivers 
build config 00:01:22.400 crypto/dpaa_sec: not in enabled drivers build config 00:01:22.400 crypto/dpaa2_sec: not in enabled drivers build config 00:01:22.400 crypto/ipsec_mb: not in enabled drivers build config 00:01:22.400 crypto/mlx5: not in enabled drivers build config 00:01:22.400 crypto/mvsam: not in enabled drivers build config 00:01:22.400 crypto/nitrox: not in enabled drivers build config 00:01:22.400 crypto/null: not in enabled drivers build config 00:01:22.400 crypto/octeontx: not in enabled drivers build config 00:01:22.400 crypto/openssl: not in enabled drivers build config 00:01:22.400 crypto/scheduler: not in enabled drivers build config 00:01:22.400 crypto/uadk: not in enabled drivers build config 00:01:22.400 crypto/virtio: not in enabled drivers build config 00:01:22.400 compress/isal: not in enabled drivers build config 00:01:22.400 compress/mlx5: not in enabled drivers build config 00:01:22.400 compress/octeontx: not in enabled drivers build config 00:01:22.400 compress/zlib: not in enabled drivers build config 00:01:22.400 regex/mlx5: not in enabled drivers build config 00:01:22.400 regex/cn9k: not in enabled drivers build config 00:01:22.400 vdpa/ifc: not in enabled drivers build config 00:01:22.400 vdpa/mlx5: not in enabled drivers build config 00:01:22.400 vdpa/sfc: not in enabled drivers build config 00:01:22.400 event/cnxk: not in enabled drivers build config 00:01:22.400 event/dlb2: not in enabled drivers build config 00:01:22.400 event/dpaa: not in enabled drivers build config 00:01:22.400 event/dpaa2: not in enabled drivers build config 00:01:22.400 event/dsw: not in enabled drivers build config 00:01:22.400 event/opdl: not in enabled drivers build config 00:01:22.400 event/skeleton: not in enabled drivers build config 00:01:22.400 event/sw: not in enabled drivers build config 00:01:22.400 event/octeontx: not in enabled drivers build config 00:01:22.400 baseband/acc: not in enabled drivers build config 00:01:22.400 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:01:22.400 baseband/fpga_lte_fec: not in enabled drivers build config 00:01:22.400 baseband/la12xx: not in enabled drivers build config 00:01:22.400 baseband/null: not in enabled drivers build config 00:01:22.400 baseband/turbo_sw: not in enabled drivers build config 00:01:22.400 gpu/cuda: not in enabled drivers build config 00:01:22.400 00:01:22.400 00:01:22.400 Build targets in project: 311 00:01:22.400 00:01:22.400 DPDK 22.11.4 00:01:22.400 00:01:22.400 User defined options 00:01:22.400 libdir : lib 00:01:22.400 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:22.400 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:01:22.400 c_link_args : 00:01:22.400 enable_docs : false 00:01:22.400 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:01:22.400 enable_kmods : false 00:01:22.400 machine : native 00:01:22.400 tests : false 00:01:22.400 00:01:22.400 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:22.400 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
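Two deprecations surface in this configure step: the "machine" option (superseded by "cpu_instruction_set", per the warning near the top of the output) and the bare "meson [options]" form (superseded by "meson setup [options]", per the warning just above). A warning-free equivalent of the command used here would look roughly like the following; this is a sketch with the paths and flags copied from the log and only the deprecated pieces swapped:

    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false \
        -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Dcpu_instruction_set=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,

The compile itself then runs through ninja with an explicit job count (-j112 in the next step).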
00:01:22.400 16:37:07 -- common/autobuild_common.sh@189 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:01:22.400 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:01:22.400 [1/740] Generating lib/rte_kvargs_def with a custom command 00:01:22.400 [2/740] Generating lib/rte_kvargs_mingw with a custom command 00:01:22.400 [3/740] Generating lib/rte_telemetry_def with a custom command 00:01:22.400 [4/740] Generating lib/rte_telemetry_mingw with a custom command 00:01:22.400 [5/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:22.401 [6/740] Generating lib/rte_ring_mingw with a custom command 00:01:22.401 [7/740] Generating lib/rte_rcu_mingw with a custom command 00:01:22.667 [8/740] Generating lib/rte_ring_def with a custom command 00:01:22.668 [9/740] Generating lib/rte_mbuf_mingw with a custom command 00:01:22.668 [10/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:22.668 [11/740] Generating lib/rte_mempool_mingw with a custom command 00:01:22.668 [12/740] Generating lib/rte_mempool_def with a custom command 00:01:22.668 [13/740] Generating lib/rte_mbuf_def with a custom command 00:01:22.668 [14/740] Generating lib/rte_net_def with a custom command 00:01:22.668 [15/740] Generating lib/rte_eal_def with a custom command 00:01:22.668 [16/740] Generating lib/rte_rcu_def with a custom command 00:01:22.668 [17/740] Generating lib/rte_meter_def with a custom command 00:01:22.668 [18/740] Generating lib/rte_eal_mingw with a custom command 00:01:22.668 [19/740] Generating lib/rte_net_mingw with a custom command 00:01:22.668 [20/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:22.668 [21/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:22.668 [22/740] Generating lib/rte_meter_mingw with a custom command 00:01:22.668 [23/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:22.668 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:22.668 [25/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:22.668 [26/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:22.668 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:22.668 [28/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:22.668 [29/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:01:22.668 [30/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:22.668 [31/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:22.668 [32/740] Generating lib/rte_ethdev_mingw with a custom command 00:01:22.668 [33/740] Generating lib/rte_pci_mingw with a custom command 00:01:22.668 [34/740] Generating lib/rte_pci_def with a custom command 00:01:22.668 [35/740] Generating lib/rte_ethdev_def with a custom command 00:01:22.668 [36/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:22.668 [37/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:22.668 [38/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:22.668 [39/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:22.668 [40/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:22.668 [41/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 
00:01:22.668 [42/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:22.668 [43/740] Linking static target lib/librte_kvargs.a 00:01:22.668 [44/740] Generating lib/rte_cmdline_mingw with a custom command 00:01:22.668 [45/740] Generating lib/rte_metrics_def with a custom command 00:01:22.668 [46/740] Generating lib/rte_cmdline_def with a custom command 00:01:22.668 [47/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:22.668 [48/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:22.668 [49/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:22.668 [50/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:22.668 [51/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:22.668 [52/740] Generating lib/rte_metrics_mingw with a custom command 00:01:22.668 [53/740] Generating lib/rte_hash_def with a custom command 00:01:22.668 [54/740] Generating lib/rte_hash_mingw with a custom command 00:01:22.668 [55/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:22.668 [56/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:22.668 [57/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:22.668 [58/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:22.668 [59/740] Generating lib/rte_timer_mingw with a custom command 00:01:22.668 [60/740] Generating lib/rte_timer_def with a custom command 00:01:22.668 [61/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:22.668 [62/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:22.668 [63/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:22.668 [64/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:22.668 [65/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:22.668 [66/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:22.668 [67/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:22.668 [68/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:22.668 [69/740] Generating lib/rte_acl_mingw with a custom command 00:01:22.668 [70/740] Generating lib/rte_acl_def with a custom command 00:01:22.668 [71/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:22.668 [72/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:22.668 [73/740] Generating lib/rte_bbdev_def with a custom command 00:01:22.668 [74/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:22.668 [75/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:22.668 [76/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:22.668 [77/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:22.668 [78/740] Generating lib/rte_bbdev_mingw with a custom command 00:01:22.668 [79/740] Generating lib/rte_bitratestats_def with a custom command 00:01:22.668 [80/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:22.668 [81/740] Generating lib/rte_bitratestats_mingw with a custom command 00:01:22.668 [82/740] Linking static target lib/librte_pci.a 00:01:22.668 [83/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:22.668 [84/740] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:22.668 [85/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:22.668 [86/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:22.668 [87/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:22.927 [88/740] Generating lib/rte_bpf_def with a custom command 00:01:22.927 [89/740] Generating lib/rte_bpf_mingw with a custom command 00:01:22.927 [90/740] Generating lib/rte_cfgfile_mingw with a custom command 00:01:22.927 [91/740] Generating lib/rte_cfgfile_def with a custom command 00:01:22.927 [92/740] Linking static target lib/librte_meter.a 00:01:22.927 [93/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:22.927 [94/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:22.927 [95/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:22.927 [96/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:22.927 [97/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:22.927 [98/740] Generating lib/rte_compressdev_def with a custom command 00:01:22.927 [99/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:22.927 [100/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:22.927 [101/740] Linking static target lib/librte_ring.a 00:01:22.927 [102/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:22.927 [103/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:22.927 [104/740] Generating lib/rte_compressdev_mingw with a custom command 00:01:22.927 [105/740] Generating lib/rte_cryptodev_def with a custom command 00:01:22.927 [106/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:01:22.927 [107/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:22.927 [108/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:22.927 [109/740] Generating lib/rte_cryptodev_mingw with a custom command 00:01:22.927 [110/740] Generating lib/rte_distributor_mingw with a custom command 00:01:22.927 [111/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:22.927 [112/740] Generating lib/rte_distributor_def with a custom command 00:01:22.927 [113/740] Generating lib/rte_efd_def with a custom command 00:01:22.927 [114/740] Generating lib/rte_efd_mingw with a custom command 00:01:22.927 [115/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:22.927 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:22.927 [117/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:22.927 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:22.927 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:22.927 [120/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:22.927 [121/740] Generating lib/rte_gpudev_def with a custom command 00:01:22.927 [122/740] Generating lib/rte_eventdev_def with a custom command 00:01:22.927 [123/740] Generating lib/rte_eventdev_mingw with a custom command 00:01:22.927 [124/740] Generating lib/rte_gpudev_mingw with a custom command 00:01:22.927 [125/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:22.927 [126/740] Generating lib/rte_gro_def with a custom 
command 00:01:22.927 [127/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:01:22.927 [128/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:22.927 [129/740] Generating lib/rte_gro_mingw with a custom command 00:01:22.927 [130/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:22.927 [131/740] Generating lib/rte_gso_def with a custom command 00:01:22.927 [132/740] Generating lib/rte_gso_mingw with a custom command 00:01:22.927 [133/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:23.194 [134/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:23.194 [135/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.194 [136/740] Generating lib/rte_ip_frag_def with a custom command 00:01:23.194 [137/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:23.194 [138/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.194 [139/740] Generating lib/rte_ip_frag_mingw with a custom command 00:01:23.194 [140/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.194 [141/740] Generating lib/rte_jobstats_def with a custom command 00:01:23.194 [142/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:23.194 [143/740] Generating lib/rte_jobstats_mingw with a custom command 00:01:23.194 [144/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:23.194 [145/740] Linking target lib/librte_kvargs.so.23.0 00:01:23.194 [146/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:23.194 [147/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:01:23.194 [148/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:23.194 [149/740] Generating lib/rte_latencystats_def with a custom command 00:01:23.194 [150/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:23.194 [151/740] Linking static target lib/librte_cfgfile.a 00:01:23.194 [152/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:23.194 [153/740] Generating lib/rte_latencystats_mingw with a custom command 00:01:23.194 [154/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:23.194 [155/740] Generating lib/rte_lpm_def with a custom command 00:01:23.194 [156/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:23.194 [157/740] Generating lib/rte_lpm_mingw with a custom command 00:01:23.194 [158/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:23.194 [159/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:23.194 [160/740] Generating lib/rte_member_def with a custom command 00:01:23.194 [161/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:23.194 [162/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:23.194 [163/740] Generating lib/rte_member_mingw with a custom command 00:01:23.194 [164/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:01:23.194 [165/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.194 [166/740] Generating lib/rte_pcapng_def with a custom command 00:01:23.194 [167/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:23.195 [168/740] Compiling C 
object lib/librte_net.a.p/net_rte_ether.c.o 00:01:23.195 [169/740] Generating lib/rte_pcapng_mingw with a custom command 00:01:23.195 [170/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:01:23.195 [171/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:23.195 [172/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:23.195 [173/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:23.195 [174/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:23.195 [175/740] Linking static target lib/librte_jobstats.a 00:01:23.195 [176/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:23.464 [177/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:23.464 [178/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:23.464 [179/740] Generating lib/rte_power_def with a custom command 00:01:23.464 [180/740] Linking static target lib/librte_cmdline.a 00:01:23.464 [181/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:23.464 [182/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:23.464 [183/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:23.464 [184/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:01:23.464 [185/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:23.464 [186/740] Linking static target lib/librte_timer.a 00:01:23.464 [187/740] Generating lib/rte_power_mingw with a custom command 00:01:23.464 [188/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:23.464 [189/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:01:23.464 [190/740] Linking static target lib/librte_metrics.a 00:01:23.464 [191/740] Linking static target lib/librte_telemetry.a 00:01:23.464 [192/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:01:23.464 [193/740] Generating lib/rte_rawdev_mingw with a custom command 00:01:23.464 [194/740] Generating lib/rte_rawdev_def with a custom command 00:01:23.464 [195/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:23.464 [196/740] Generating lib/rte_regexdev_mingw with a custom command 00:01:23.464 [197/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:23.464 [198/740] Generating lib/rte_regexdev_def with a custom command 00:01:23.464 [199/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:01:23.464 [200/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:01:23.464 [201/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:01:23.464 [202/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:23.464 [203/740] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:23.464 [204/740] Generating lib/rte_dmadev_mingw with a custom command 00:01:23.464 [205/740] Generating lib/rte_rib_mingw with a custom command 00:01:23.464 [206/740] Generating lib/rte_rib_def with a custom command 00:01:23.464 [207/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:23.464 [208/740] Generating lib/rte_dmadev_def with a custom command 00:01:23.464 [209/740] Generating lib/rte_reorder_def with a custom command 00:01:23.464 [210/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:23.464 [211/740] Compiling C object 
lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:23.464 [212/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:01:23.464 [213/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:23.464 [214/740] Generating lib/rte_reorder_mingw with a custom command 00:01:23.464 [215/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:23.464 [216/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:01:23.464 [217/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:23.464 [218/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:01:23.464 [219/740] Generating lib/rte_sched_def with a custom command 00:01:23.464 [220/740] Generating lib/rte_sched_mingw with a custom command 00:01:23.464 [221/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:23.464 [222/740] Generating lib/rte_security_mingw with a custom command 00:01:23.464 [223/740] Generating lib/rte_security_def with a custom command 00:01:23.464 [224/740] Linking static target lib/librte_net.a 00:01:23.464 [225/740] Linking static target lib/librte_bitratestats.a 00:01:23.464 [226/740] Generating lib/rte_stack_mingw with a custom command 00:01:23.464 [227/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:01:23.464 [228/740] Generating lib/rte_stack_def with a custom command 00:01:23.464 [229/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:23.464 [230/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:23.464 [231/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:01:23.464 [232/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:23.464 [233/740] Generating lib/rte_vhost_def with a custom command 00:01:23.464 [234/740] Generating lib/rte_vhost_mingw with a custom command 00:01:23.464 [235/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:23.465 [236/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:23.465 [237/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:23.465 [238/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:01:23.465 [239/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:01:23.465 [240/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:01:23.465 [241/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:23.465 [242/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:01:23.465 [243/740] Generating lib/rte_ipsec_def with a custom command 00:01:23.465 [244/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:01:23.465 [245/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:01:23.465 [246/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:23.465 [247/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:01:23.465 [248/740] Generating lib/rte_ipsec_mingw with a custom command 00:01:23.465 [249/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:01:23.731 [250/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:01:23.731 [251/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:01:23.731 [252/740] Generating lib/rte_fib_def with a custom command 00:01:23.731 [253/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:01:23.731 [254/740] 
Generating lib/rte_fib_mingw with a custom command 00:01:23.731 [255/740] Linking static target lib/librte_stack.a 00:01:23.731 [256/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:01:23.731 [257/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:01:23.731 [258/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:01:23.731 [259/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:01:23.731 [260/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:23.731 [261/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:01:23.732 [262/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:01:23.732 [263/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:23.732 [264/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:01:23.732 [265/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:23.732 [266/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:23.732 [267/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.732 [268/740] Generating lib/rte_port_def with a custom command 00:01:23.732 [269/740] Linking static target lib/librte_compressdev.a 00:01:23.732 [270/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:23.732 [271/740] Generating lib/rte_pdump_def with a custom command 00:01:23.732 [272/740] Generating lib/rte_port_mingw with a custom command 00:01:23.732 [273/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:01:23.732 [274/740] Generating lib/rte_pdump_mingw with a custom command 00:01:23.732 [275/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:01:23.732 [276/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:01:23.732 [277/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:01:23.732 [278/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:01:23.732 [279/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:23.732 [280/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:01:23.732 [281/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.732 [282/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.732 [283/740] Linking static target lib/librte_rcu.a 00:01:23.732 [284/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:23.732 [285/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:01:23.732 [286/740] Linking static target lib/librte_mempool.a 00:01:23.732 [287/740] Linking static target lib/librte_rawdev.a 00:01:23.732 [288/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:01:23.732 [289/740] Generating lib/rte_table_mingw with a custom command 00:01:23.991 [290/740] Generating lib/rte_table_def with a custom command 00:01:23.991 [291/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.991 [292/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:01:23.991 [293/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:01:23.991 [294/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:01:23.991 [295/740] Linking static target lib/librte_bbdev.a 00:01:23.991 [296/740] Compiling 
C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:23.991 [297/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:01:23.991 [298/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:01:23.991 [299/740] Linking static target lib/librte_dmadev.a 00:01:23.991 [300/740] Linking static target lib/librte_gpudev.a 00:01:23.991 [301/740] Linking static target lib/librte_gro.a 00:01:23.991 [302/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:01:23.991 [303/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.991 [304/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:23.991 [305/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:01:23.991 [306/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.991 [307/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:01:23.991 [308/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.991 [309/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:01:23.991 [310/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:01:23.992 [311/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:01:23.992 [312/740] Linking static target lib/librte_gso.a 00:01:23.992 [313/740] Generating lib/rte_pipeline_def with a custom command 00:01:23.992 [314/740] Generating lib/rte_pipeline_mingw with a custom command 00:01:23.992 [315/740] Linking target lib/librte_telemetry.so.23.0 00:01:23.992 [316/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:01:23.992 [317/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:01:23.992 [318/740] Generating lib/rte_graph_def with a custom command 00:01:23.992 [319/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:01:23.992 [320/740] Generating lib/rte_graph_mingw with a custom command 00:01:23.992 [321/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:01:23.992 [322/740] Linking static target lib/librte_latencystats.a 00:01:23.992 [323/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:01:23.992 [324/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:01:23.992 [325/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:23.992 [326/740] Linking static target lib/librte_distributor.a 00:01:23.992 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:01:23.992 [328/740] Linking static target lib/librte_ip_frag.a 00:01:23.992 [329/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:01:23.992 [330/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:01:24.260 [331/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:24.260 [332/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:01:24.260 [333/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:24.260 [334/740] Linking static target lib/librte_regexdev.a 00:01:24.260 [335/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:01:24.260 [336/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:01:24.260 [337/740] Generating symbol file 
lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:01:24.260 [338/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:01:24.260 [339/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:01:24.260 [340/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:01:24.260 [341/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:01:24.260 [342/740] Generating lib/rte_node_def with a custom command 00:01:24.260 [343/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:24.260 [344/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.260 [345/740] Generating lib/rte_node_mingw with a custom command 00:01:24.260 [346/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:01:24.260 [347/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.260 [348/740] Linking static target lib/librte_eal.a 00:01:24.260 [349/740] Generating drivers/rte_bus_pci_def with a custom command 00:01:24.260 [350/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:01:24.260 [351/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:01:24.260 [352/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:24.260 [353/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:24.260 [354/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:24.260 [355/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.260 [356/740] Generating drivers/rte_bus_vdev_def with a custom command 00:01:24.260 [357/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:24.260 [358/740] Linking static target lib/librte_power.a 00:01:24.260 [359/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:01:24.260 [360/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:01:24.260 [361/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:24.260 [362/740] Linking static target lib/librte_reorder.a 00:01:24.260 [363/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:01:24.260 [364/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:24.260 [365/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:01:24.260 [366/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.260 [367/740] Generating drivers/rte_mempool_ring_def with a custom command 00:01:24.260 [368/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:01:24.521 [369/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:24.521 [370/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:01:24.521 [371/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:01:24.521 [372/740] Linking static target lib/librte_security.a 00:01:24.521 [373/740] Linking static target lib/librte_pcapng.a 00:01:24.521 [374/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:01:24.521 [375/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:24.521 [376/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.521 [377/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:01:24.521 [378/740] Compiling C 
object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:01:24.521 [379/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:01:24.521 [380/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:01:24.521 [381/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:24.521 [382/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:01:24.521 [383/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:01:24.521 [384/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:01:24.521 [385/740] Linking static target lib/librte_mbuf.a 00:01:24.521 [386/740] Linking static target lib/librte_bpf.a 00:01:24.521 [387/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:01:24.521 [388/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:01:24.522 [389/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.522 [390/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.522 [391/740] Generating drivers/rte_net_i40e_def with a custom command 00:01:24.522 [392/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:01:24.522 [393/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:01:24.522 [394/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:24.522 [395/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:24.522 [396/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:01:24.522 [397/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:01:24.522 [398/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:01:24.522 [399/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:24.787 [400/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:24.787 [401/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:01:24.787 [402/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:01:24.787 [403/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:01:24.787 [404/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:01:24.787 [405/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:01:24.787 [406/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:01:24.787 [407/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:24.787 [408/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:01:24.787 [409/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:01:24.787 [410/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:01:24.787 [411/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:01:24.787 [412/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:01:24.787 [413/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:01:24.787 [414/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:01:24.787 [415/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.787 [416/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:01:24.787 [417/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:01:24.787 [418/740] Linking static target lib/librte_rib.a 
00:01:24.787 [419/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.787 [420/740] Linking static target lib/librte_lpm.a 00:01:24.787 [421/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:01:24.787 [422/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.787 [423/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:01:24.787 [424/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:01:24.787 [425/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:01:24.787 [426/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:24.787 [427/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:24.787 [428/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:01:24.788 [429/740] Linking static target lib/librte_graph.a 00:01:24.788 [430/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:01:24.788 [431/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:01:24.788 [432/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:01:24.788 [433/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:24.788 [434/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:01:24.788 [435/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:01:24.788 [436/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:01:24.788 [437/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:01:24.788 [438/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:01:25.053 [439/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:01:25.053 [440/740] Linking static target lib/librte_efd.a 00:01:25.053 [441/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:01:25.053 [442/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.053 [443/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:01:25.053 [444/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:01:25.053 [445/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:25.053 [446/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:25.053 [447/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:25.053 [448/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.053 [449/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:25.053 [450/740] Linking static target drivers/librte_bus_vdev.a 00:01:25.053 [451/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:25.053 [452/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:01:25.053 [453/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.053 [454/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:01:25.053 [455/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:01:25.053 [456/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:01:25.053 [457/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:01:25.053 [458/740] Generating 
lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.053 [459/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.053 [460/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:01:25.316 [461/740] Linking static target lib/librte_fib.a 00:01:25.316 [462/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.316 [463/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:01:25.316 [464/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:01:25.316 [465/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.316 [466/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.316 [467/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:25.316 [468/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:01:25.316 [469/740] Linking static target lib/librte_pdump.a 00:01:25.316 [470/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:01:25.316 [471/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.316 [472/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:25.316 [473/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:01:25.316 [474/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:01:25.316 [475/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:01:25.316 [476/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.316 [477/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:25.316 [478/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:25.580 [479/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.580 [480/740] Linking static target drivers/librte_bus_pci.a 00:01:25.580 [481/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:01:25.580 [482/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.580 [483/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:01:25.580 [484/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:25.580 [485/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:01:25.580 [486/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:01:25.580 [487/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:01:25.580 [488/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:01:25.580 [489/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:01:25.580 [490/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:01:25.580 [491/740] Linking static target lib/librte_table.a 00:01:25.580 [492/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:01:25.580 [493/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:01:25.580 [494/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:01:25.580 
[495/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:01:25.580 [496/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:01:25.580 [497/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:01:25.580 [498/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:01:25.839 [499/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:01:25.839 [500/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:01:25.839 [501/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:01:25.839 [502/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.839 [503/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:01:25.839 [504/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:01:25.839 [505/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:01:25.839 [506/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:01:25.839 [507/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:01:25.839 [508/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:01:25.839 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:01:25.839 [510/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.839 [511/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:01:25.839 [512/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:01:25.839 [513/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:01:25.839 [514/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.839 [515/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:01:25.839 [516/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:01:25.839 [517/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:01:25.839 [518/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:01:25.839 [519/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:01:25.839 [520/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:01:25.839 [521/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:25.839 [522/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:25.839 [523/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:01:25.839 [524/740] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:25.839 [525/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:25.839 [526/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:01:25.839 [527/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:01:26.099 [528/740] Linking static target lib/librte_cryptodev.a 00:01:26.099 [529/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:01:26.099 [530/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:01:26.099 [531/740] Compiling C object 
app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:01:26.099 [532/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:01:26.099 [533/740] Linking static target lib/librte_sched.a 00:01:26.099 [534/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:01:26.099 [535/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:01:26.099 [536/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:01:26.099 [537/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:01:26.099 [538/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:01:26.099 [539/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:01:26.099 [540/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:01:26.099 [541/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:01:26.099 [542/740] Linking static target lib/librte_node.a 00:01:26.099 [543/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:01:26.099 [544/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:01:26.099 [545/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.099 [546/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:01:26.099 [547/740] Linking static target lib/librte_ipsec.a 00:01:26.099 [548/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:26.099 [549/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:01:26.099 [550/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:26.099 [551/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:26.099 [552/740] Linking static target drivers/librte_mempool_ring.a 00:01:26.099 [553/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:01:26.099 [554/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:26.099 [555/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:01:26.099 [556/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:01:26.099 [557/740] Linking static target lib/librte_ethdev.a 00:01:26.359 [558/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:01:26.359 [559/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:01:26.359 [560/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:01:26.359 [561/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:01:26.359 [562/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:01:26.359 [563/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:01:26.359 [564/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:01:26.359 [565/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:01:26.359 [566/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:01:26.359 [567/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:01:26.359 [568/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:01:26.359 [569/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:01:26.359 [570/740] Compiling C object 
app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:01:26.359 [571/740] Linking static target lib/librte_port.a 00:01:26.359 [572/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:01:26.359 [573/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:01:26.359 [574/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:01:26.359 [575/740] Linking static target lib/librte_member.a 00:01:26.359 [576/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.359 [577/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:01:26.359 [578/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:01:26.359 [579/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:01:26.359 [580/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:01:26.359 [581/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:01:26.359 [582/740] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:01:26.359 [583/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:01:26.359 [584/740] Linking static target lib/librte_eventdev.a 00:01:26.359 [585/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:01:26.359 [586/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.359 [587/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:01:26.359 [588/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:01:26.619 [589/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:01:26.620 [590/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:01:26.620 [591/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:01:26.620 [592/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:01:26.620 [593/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:01:26.620 [594/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:01:26.620 [595/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.620 [596/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.620 [597/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:01:26.620 [598/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:26.620 [599/740] Linking static target lib/librte_hash.a 00:01:26.879 [600/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:01:26.879 [601/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:01:26.879 [602/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:01:26.879 [603/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:01:26.879 [604/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:01:26.879 [605/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:01:26.879 [606/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:01:26.879 [607/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o 00:01:26.879 [608/740] Compiling C object 
drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:01:26.879 [609/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:01:27.138 [610/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:01:27.138 [611/740] Linking static target lib/librte_acl.a 00:01:27.138 [612/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:01:27.138 [613/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:01:27.398 [614/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:01:27.398 [615/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:01:27.398 [616/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:01:27.658 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:01:27.918 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:27.918 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:01:28.178 [620/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:01:28.178 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:01:28.747 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:01:28.747 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a 00:01:29.007 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:01:29.007 [625/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:29.007 [626/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:01:29.007 [627/740] Linking static target drivers/librte_net_i40e.a 00:01:29.576 [628/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:29.576 [629/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:01:29.836 [630/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:29.836 [631/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:01:29.836 [632/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:30.096 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.379 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.639 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:35.639 [636/740] Linking static target lib/librte_vhost.a 00:01:36.211 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:01:36.211 [638/740] Linking static target lib/librte_pipeline.a 00:01:36.470 [639/740] Linking target app/dpdk-dumpcap 00:01:36.470 [640/740] Linking target app/dpdk-test-cmdline 00:01:36.470 [641/740] Linking target app/dpdk-test-compress-perf 00:01:36.470 [642/740] Linking target app/dpdk-test-acl 00:01:36.470 [643/740] Linking target app/dpdk-pdump 00:01:36.470 [644/740] Linking target app/dpdk-test-gpudev 00:01:36.470 [645/740] Linking target app/dpdk-test-bbdev 00:01:36.470 [646/740] Linking target app/dpdk-proc-info 00:01:36.470 [647/740] Linking target app/dpdk-test-sad 00:01:36.470 [648/740] Linking target app/dpdk-test-fib 00:01:36.470 [649/740] Linking target app/dpdk-test-flow-perf 00:01:36.470 [650/740] Linking target app/dpdk-test-crypto-perf 00:01:36.470 
[651/740] Linking target app/dpdk-test-regex 00:01:36.470 [652/740] Linking target app/dpdk-test-security-perf 00:01:36.470 [653/740] Linking target app/dpdk-test-pipeline 00:01:36.470 [654/740] Linking target app/dpdk-test-eventdev 00:01:36.730 [655/740] Linking target app/dpdk-testpmd 00:01:38.115 [656/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.375 [657/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.635 [658/740] Linking target lib/librte_eal.so.23.0 00:01:38.635 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols 00:01:38.895 [660/740] Linking target lib/librte_ring.so.23.0 00:01:38.895 [661/740] Linking target lib/librte_timer.so.23.0 00:01:38.895 [662/740] Linking target lib/librte_meter.so.23.0 00:01:38.895 [663/740] Linking target lib/librte_stack.so.23.0 00:01:38.895 [664/740] Linking target lib/librte_jobstats.so.23.0 00:01:38.895 [665/740] Linking target lib/librte_cfgfile.so.23.0 00:01:38.895 [666/740] Linking target lib/librte_rawdev.so.23.0 00:01:38.895 [667/740] Linking target lib/librte_pci.so.23.0 00:01:38.895 [668/740] Linking target lib/librte_dmadev.so.23.0 00:01:38.895 [669/740] Linking target drivers/librte_bus_vdev.so.23.0 00:01:38.895 [670/740] Linking target lib/librte_graph.so.23.0 00:01:38.895 [671/740] Linking target lib/librte_acl.so.23.0 00:01:38.895 [672/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols 00:01:38.895 [673/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols 00:01:38.895 [674/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols 00:01:38.895 [675/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols 00:01:38.895 [676/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols 00:01:38.895 [677/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols 00:01:38.895 [678/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols 00:01:38.895 [679/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols 00:01:38.895 [680/740] Linking target lib/librte_rcu.so.23.0 00:01:38.895 [681/740] Linking target drivers/librte_bus_pci.so.23.0 00:01:38.895 [682/740] Linking target lib/librte_mempool.so.23.0 00:01:39.155 [683/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols 00:01:39.155 [684/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols 00:01:39.155 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols 00:01:39.155 [686/740] Linking target lib/librte_rib.so.23.0 00:01:39.155 [687/740] Linking target lib/librte_mbuf.so.23.0 00:01:39.155 [688/740] Linking target drivers/librte_mempool_ring.so.23.0 00:01:39.155 [689/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols 00:01:39.155 [690/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols 00:01:39.415 [691/740] Linking target lib/librte_distributor.so.23.0 00:01:39.415 [692/740] Linking target lib/librte_bbdev.so.23.0 00:01:39.415 [693/740] Linking target lib/librte_compressdev.so.23.0 00:01:39.415 [694/740] Linking target lib/librte_net.so.23.0 00:01:39.415 [695/740] Linking target lib/librte_sched.so.23.0 00:01:39.415 [696/740] Linking target lib/librte_gpudev.so.23.0 
00:01:39.415 [697/740] Linking target lib/librte_regexdev.so.23.0
00:01:39.415 [698/740] Linking target lib/librte_reorder.so.23.0
00:01:39.415 [699/740] Linking target lib/librte_cryptodev.so.23.0
00:01:39.415 [700/740] Linking target lib/librte_fib.so.23.0
00:01:39.415 [701/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols
00:01:39.415 [702/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols
00:01:39.415 [703/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols
00:01:39.415 [704/740] Linking target lib/librte_cmdline.so.23.0
00:01:39.415 [705/740] Linking target lib/librte_hash.so.23.0
00:01:39.415 [706/740] Linking target lib/librte_security.so.23.0
00:01:39.415 [707/740] Linking target lib/librte_ethdev.so.23.0
00:01:39.676 [708/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols
00:01:39.676 [709/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols
00:01:39.676 [710/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols
00:01:39.676 [711/740] Linking target lib/librte_gro.so.23.0
00:01:39.676 [712/740] Linking target lib/librte_metrics.so.23.0
00:01:39.676 [713/740] Linking target lib/librte_ip_frag.so.23.0
00:01:39.676 [714/740] Linking target lib/librte_pcapng.so.23.0
00:01:39.676 [715/740] Linking target lib/librte_bpf.so.23.0
00:01:39.676 [716/740] Linking target lib/librte_gso.so.23.0
00:01:39.676 [717/740] Linking target lib/librte_efd.so.23.0
00:01:39.676 [718/740] Linking target lib/librte_lpm.so.23.0
00:01:39.676 [719/740] Linking target lib/librte_power.so.23.0
00:01:39.676 [720/740] Linking target lib/librte_member.so.23.0
00:01:39.676 [721/740] Linking target lib/librte_ipsec.so.23.0
00:01:39.676 [722/740] Linking target lib/librte_eventdev.so.23.0
00:01:39.676 [723/740] Linking target lib/librte_vhost.so.23.0
00:01:39.676 [724/740] Linking target drivers/librte_net_i40e.so.23.0
00:01:39.936 [725/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols
00:01:39.936 [726/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols
00:01:39.936 [727/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols
00:01:39.936 [728/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols
00:01:39.936 [729/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols
00:01:39.936 [730/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols
00:01:39.936 [731/740] Linking target lib/librte_bitratestats.so.23.0
00:01:39.936 [732/740] Linking target lib/librte_latencystats.so.23.0
00:01:39.936 [733/740] Linking target lib/librte_pdump.so.23.0
00:01:39.936 [734/740] Linking target lib/librte_node.so.23.0
00:01:39.936 [735/740] Linking target lib/librte_port.so.23.0
00:01:40.196 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols
00:01:40.196 [737/740] Linking target lib/librte_table.so.23.0
00:01:40.196 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols
00:01:41.577 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output)
00:01:41.577 [740/740] Linking target lib/librte_pipeline.so.23.0
00:01:41.577 16:37:27 -- common/autobuild_common.sh@190 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install
00:01:41.577 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:01:41.577 [0/1] Installing files.
00:01:41.842 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.842 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:01:41.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:01:41.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib
00:01:41.847 Installing
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:41.847 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:41.847 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:01:41.847 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.847 Installing lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_hash.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
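(Note: a quick way to sanity-check an install like the one above is pkg-config. The meson install that produced these lines normally also drops a libdpdk.pc under the library directory — build/lib/pkgconfig in this layout, which is an assumption here, since the .pc install line is not shown in this excerpt. A minimal sketch, with all paths inferred from the destination directories above:

  # Assumed paths, inferred from the "Installing ... to" destinations in this log.
  export DPDK_BUILD=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
  export PKG_CONFIG_PATH="$DPDK_BUILD/lib/pkgconfig"
  pkg-config --modversion libdpdk      # expect 22.11.x; the .so.23.0 sonames above are the ABI version
  pkg-config --cflags --libs libdpdk   # the flags a consumer application build would pick up
  # Confirm an installed app resolves the just-installed shared objects:
  LD_LIBRARY_PATH="$DPDK_BUILD/lib" ldd "$DPDK_BUILD/bin/dpdk-testpmd" | grep librte_

Both static archives (.a) and shared objects (.so.23.0) are being installed here; pkg-config emits the shared-library link line by default, and `pkg-config --static --libs libdpdk` would yield the static set instead.)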
00:01:41.848 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:41.848 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_ipsec.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:42.112 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:42.112 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:42.112 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.112 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:01:42.112 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-test-eventdev to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.112 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.113 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.113 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.113 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.113 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.113 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.113 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.113 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.114 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.115 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
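The header installs running through this stretch of the log stage the public DPDK API into dpdk/build/include; the matching libdpdk.pc files land in dpdk/build/lib/pkgconfig a little further down, and later stages of this job resolve both through pkg-config rather than hard-coded paths. A minimal sketch of that consumption, assuming a scratch file name check_dpdk.c that is not part of this job:
+ export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
+ cat > check_dpdk.c <<'EOF'
#include <rte_hash.h>   /* one of the headers staged above */
int main(void) { return 0; }
EOF
+ clang-17 check_dpdk.c $(pkg-config --cflags --libs libdpdk) -o check_dpdk   # resolves the staged include dir and librte_*.so via libdpdk.pc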
00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:42.116 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:01:42.116 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:01:42.116 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:01:42.116 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:01:42.116 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:01:42.116 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:01:42.116 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:01:42.116 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:01:42.116 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:01:42.116 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:01:42.116 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:01:42.116 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:01:42.116 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:01:42.116 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:01:42.116 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:01:42.116 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:01:42.116 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:01:42.116 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:01:42.117 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:01:42.117 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:01:42.117 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:01:42.117 Installing symlink pointing to librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:01:42.117 Installing symlink 
pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:01:42.117 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:01:42.117 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:01:42.117 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:01:42.117 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:01:42.117 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:01:42.117 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:01:42.117 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:01:42.117 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:01:42.117 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:01:42.117 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:01:42.117 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:01:42.117 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:01:42.117 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:01:42.117 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:01:42.117 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:01:42.117 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:01:42.117 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:01:42.117 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:01:42.117 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:01:42.117 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:01:42.117 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:01:42.117 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:01:42.117 Installing symlink pointing to librte_distributor.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:01:42.117 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:01:42.117 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:01:42.117 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:01:42.117 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:01:42.117 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:01:42.117 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:01:42.117 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:01:42.117 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:01:42.117 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:01:42.117 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:01:42.117 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:01:42.117 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:01:42.117 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:01:42.117 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:01:42.117 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:01:42.117 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:01:42.117 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:01:42.117 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:01:42.117 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:01:42.117 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:01:42.117 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:01:42.117 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:01:42.117 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:01:42.117 Installing symlink pointing to 
librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:01:42.117 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:01:42.117 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:01:42.117 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:01:42.117 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:01:42.117 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:01:42.117 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:01:42.117 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:01:42.117 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:01:42.117 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:01:42.117 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:01:42.117 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:01:42.117 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:01:42.117 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:01:42.117 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:01:42.117 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:01:42.117 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:01:42.117 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:01:42.117 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:01:42.117 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:01:42.117 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:01:42.117 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:01:42.117 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:01:42.117 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:01:42.117 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:01:42.117 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:01:42.117 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:01:42.117 './librte_mempool_ring.so.23' -> 
'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:01:42.117 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:01:42.117 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:01:42.117 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:01:42.117 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:01:42.118 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:01:42.118 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:01:42.118 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:01:42.118 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:01:42.118 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:01:42.118 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:01:42.118 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:01:42.118 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:01:42.118 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:01:42.118 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:01:42.118 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:01:42.118 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:01:42.118 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:01:42.118 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:01:42.118 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:01:42.118 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:01:42.118 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:01:42.118 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:01:42.118 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:01:42.118 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:01:42.118 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:01:42.118 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:01:42.118 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:01:42.118 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:01:42.118 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:01:42.118 16:37:27 -- common/autobuild_common.sh@192 -- $ uname -s 00:01:42.118 16:37:27 -- common/autobuild_common.sh@192 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:01:42.118 16:37:27 -- common/autobuild_common.sh@203 -- $ cat 00:01:42.118 16:37:27 -- common/autobuild_common.sh@208 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:42.118 00:01:42.118 real 0m25.882s 00:01:42.118 user 6m35.178s 00:01:42.118 sys 2m12.499s 00:01:42.118 16:37:27 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:42.118 16:37:27 -- common/autotest_common.sh@10 -- $ set +x 00:01:42.118 ************************************ 00:01:42.118 END TEST build_native_dpdk 00:01:42.118 ************************************ 00:01:42.378 16:37:27 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:42.378 16:37:27 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:42.378 16:37:27 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:01:42.378 16:37:27 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:01:42.378 16:37:27 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:01:42.378 16:37:27 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:01:42.378 16:37:27 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:42.378 16:37:27 -- common/autotest_common.sh@10 -- $ set +x 00:01:42.378 ************************************ 00:01:42.378 START TEST autobuild_llvm_precompile 00:01:42.378 ************************************ 00:01:42.378 16:37:27 -- common/autotest_common.sh@1114 -- $ _llvm_precompile 00:01:42.378 16:37:27 -- common/autobuild_common.sh@32 -- $ clang --version 00:01:42.378 16:37:27 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:01:42.378 Target: x86_64-redhat-linux-gnu 00:01:42.378 Thread model: posix 00:01:42.378 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:01:42.378 16:37:27 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:01:42.378 16:37:27 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:01:42.378 16:37:27 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:01:42.378 16:37:27 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:01:42.378 16:37:27 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:01:42.378 16:37:27 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:01:42.378 16:37:27 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:42.378 16:37:27 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:01:42.378 16:37:27 -- 
common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:01:42.378 16:37:27 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:42.638 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:01:42.638 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:01:42.638 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:01:42.898 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:43.157 Using 'verbs' RDMA provider 00:01:58.624 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:13.556 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:13.556 Creating mk/config.mk...done. 00:02:13.556 Creating mk/cc.flags.mk...done. 00:02:13.556 Type 'make' to build. 00:02:13.556 00:02:13.556 real 0m29.394s 00:02:13.556 user 0m12.828s 00:02:13.556 sys 0m16.018s 00:02:13.556 16:37:57 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:13.556 16:37:57 -- common/autotest_common.sh@10 -- $ set +x 00:02:13.556 ************************************ 00:02:13.556 END TEST autobuild_llvm_precompile 00:02:13.556 ************************************ 00:02:13.556 16:37:57 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:13.556 16:37:57 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:13.556 16:37:57 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:13.556 16:37:57 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:13.556 16:37:57 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:13.556 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:13.556 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:13.556 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:13.556 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:13.556 Using 'verbs' RDMA provider 00:02:25.780 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:38.011 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:38.011 Creating mk/config.mk...done. 
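The configure invocations above pass --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a: that archive is clang's libFuzzer engine built without its own main(), so a fuzz target links it and supplies a main() that hands control to LLVMFuzzerRunDriver(). A minimal standalone sketch of that pattern, assuming the file name harness.c and the small -runs count, neither of which comes from this job:
+ cat > harness.c <<'EOF'
#include <stdint.h>
#include <stddef.h>
/* resolved from libclang_rt.fuzzer_no_main.a at link time */
int LLVMFuzzerRunDriver(int *argc, char ***argv,
                        int (*cb)(const uint8_t *, size_t));
static int fuzz_one(const uint8_t *data, size_t size)
{
        /* feed data/size into the code under test here */
        (void)data; (void)size;
        return 0;
}
int main(int argc, char **argv)
{
        /* the engine parses argv (corpus dirs, -runs=, etc.) and calls fuzz_one repeatedly */
        return LLVMFuzzerRunDriver(&argc, &argv, fuzz_one);
}
EOF
+ clang-17 harness.c /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a -lstdc++ -o harness   # the engine is C++, hence -lstdc++
+ ./harness -runs=16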
00:02:38.011 Creating mk/cc.flags.mk...done. 00:02:38.011 Type 'make' to build. 00:02:38.011 16:38:22 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:02:38.011 16:38:22 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:02:38.011 16:38:22 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:02:38.011 16:38:22 -- common/autotest_common.sh@10 -- $ set +x 00:02:38.011 ************************************ 00:02:38.011 START TEST make 00:02:38.011 ************************************ 00:02:38.011 16:38:22 -- common/autotest_common.sh@1114 -- $ make -j112 00:02:38.011 make[1]: Nothing to be done for 'all'. 00:02:38.949 The Meson build system 00:02:38.949 Version: 1.5.0 00:02:38.949 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:02:38.949 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:38.949 Build type: native build 00:02:38.949 Project name: libvfio-user 00:02:38.949 Project version: 0.0.1 00:02:38.949 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:02:38.949 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:02:38.949 Host machine cpu family: x86_64 00:02:38.949 Host machine cpu: x86_64 00:02:38.949 Run-time dependency threads found: YES 00:02:38.949 Library dl found: YES 00:02:38.949 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:38.949 Run-time dependency json-c found: YES 0.17 00:02:38.949 Run-time dependency cmocka found: YES 1.1.7 00:02:38.949 Program pytest-3 found: NO 00:02:38.949 Program flake8 found: NO 00:02:38.949 Program misspell-fixer found: NO 00:02:38.949 Program restructuredtext-lint found: NO 00:02:38.949 Program valgrind found: YES (/usr/bin/valgrind) 00:02:38.949 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:38.949 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:38.949 Compiler for C supports arguments -Wwrite-strings: YES 00:02:38.949 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:02:38.949 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:02:38.949 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:02:38.949 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:02:38.949 Build targets in project: 8 00:02:38.949 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:02:38.949 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:02:38.949 00:02:38.949 libvfio-user 0.0.1 00:02:38.949 00:02:38.949 User defined options 00:02:38.949 buildtype : debug 00:02:38.949 default_library: static 00:02:38.949 libdir : /usr/local/lib 00:02:38.949 00:02:38.949 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:39.209 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:39.209 [1/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:02:39.209 [2/36] Compiling C object samples/lspci.p/lspci.c.o 00:02:39.209 [3/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:02:39.209 [4/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:02:39.209 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:02:39.209 [6/36] Compiling C object samples/null.p/null.c.o 00:02:39.209 [7/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:02:39.209 [8/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:02:39.209 [9/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:02:39.209 [10/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:02:39.209 [11/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:02:39.209 [12/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:02:39.209 [13/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:02:39.209 [14/36] Compiling C object test/unit_tests.p/mocks.c.o 00:02:39.209 [15/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:02:39.209 [16/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:02:39.209 [17/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:02:39.209 [18/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:02:39.209 [19/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:02:39.209 [20/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:02:39.209 [21/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:02:39.209 [22/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:02:39.209 [23/36] Compiling C object samples/server.p/server.c.o 00:02:39.209 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:02:39.209 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:02:39.209 [26/36] Compiling C object samples/client.p/client.c.o 00:02:39.469 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:02:39.469 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:02:39.469 [29/36] Linking static target lib/libvfio-user.a 00:02:39.469 [30/36] Linking target samples/client 00:02:39.469 [31/36] Linking target test/unit_tests 00:02:39.469 [32/36] Linking target samples/server 00:02:39.469 [33/36] Linking target samples/null 00:02:39.469 [34/36] Linking target samples/lspci 00:02:39.469 [35/36] Linking target samples/shadow_ioeventfd_server 00:02:39.469 [36/36] Linking target samples/gpio-pci-idio-16 00:02:39.469 INFO: autodetecting backend as ninja 00:02:39.469 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:39.469 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:39.752 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:39.752 ninja: no work to do. 00:02:43.152 CC lib/log/log.o 00:02:43.152 CC lib/log/log_flags.o 00:02:43.152 CC lib/ut_mock/mock.o 00:02:43.152 CC lib/log/log_deprecated.o 00:02:43.152 CC lib/ut/ut.o 00:02:43.152 LIB libspdk_ut_mock.a 00:02:43.153 LIB libspdk_log.a 00:02:43.153 LIB libspdk_ut.a 00:02:43.413 CC lib/ioat/ioat.o 00:02:43.413 CC lib/dma/dma.o 00:02:43.413 CC lib/util/base64.o 00:02:43.413 CC lib/util/bit_array.o 00:02:43.413 CC lib/util/cpuset.o 00:02:43.413 CC lib/util/crc16.o 00:02:43.413 CXX lib/trace_parser/trace.o 00:02:43.413 CC lib/util/crc32.o 00:02:43.413 CC lib/util/crc32c.o 00:02:43.413 CC lib/util/crc32_ieee.o 00:02:43.413 CC lib/util/crc64.o 00:02:43.413 CC lib/util/dif.o 00:02:43.413 CC lib/util/fd.o 00:02:43.413 CC lib/util/file.o 00:02:43.413 CC lib/util/hexlify.o 00:02:43.413 CC lib/util/iov.o 00:02:43.413 CC lib/util/math.o 00:02:43.413 CC lib/util/pipe.o 00:02:43.413 CC lib/util/string.o 00:02:43.413 CC lib/util/strerror_tls.o 00:02:43.413 CC lib/util/uuid.o 00:02:43.413 CC lib/util/fd_group.o 00:02:43.413 CC lib/util/xor.o 00:02:43.413 CC lib/util/zipf.o 00:02:43.671 CC lib/vfio_user/host/vfio_user.o 00:02:43.671 CC lib/vfio_user/host/vfio_user_pci.o 00:02:43.671 LIB libspdk_dma.a 00:02:43.671 LIB libspdk_ioat.a 00:02:43.671 LIB libspdk_vfio_user.a 00:02:43.671 LIB libspdk_util.a 00:02:43.931 LIB libspdk_trace_parser.a 00:02:43.931 CC lib/vmd/vmd.o 00:02:43.931 CC lib/vmd/led.o 00:02:43.931 CC lib/rdma/common.o 00:02:43.931 CC lib/json/json_parse.o 00:02:43.931 CC lib/rdma/rdma_verbs.o 00:02:43.931 CC lib/conf/conf.o 00:02:43.931 CC lib/json/json_write.o 00:02:43.931 CC lib/json/json_util.o 00:02:43.931 CC lib/env_dpdk/env.o 00:02:43.931 CC lib/env_dpdk/memory.o 00:02:43.931 CC lib/env_dpdk/pci.o 00:02:43.931 CC lib/env_dpdk/init.o 00:02:43.931 CC lib/idxd/idxd.o 00:02:43.931 CC lib/env_dpdk/threads.o 00:02:43.931 CC lib/idxd/idxd_user.o 00:02:43.931 CC lib/idxd/idxd_kernel.o 00:02:43.931 CC lib/env_dpdk/pci_ioat.o 00:02:43.931 CC lib/env_dpdk/pci_virtio.o 00:02:43.931 CC lib/env_dpdk/pci_vmd.o 00:02:44.191 CC lib/env_dpdk/pci_idxd.o 00:02:44.191 CC lib/env_dpdk/pci_event.o 00:02:44.191 CC lib/env_dpdk/sigbus_handler.o 00:02:44.191 CC lib/env_dpdk/pci_dpdk.o 00:02:44.191 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:44.191 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:44.191 LIB libspdk_conf.a 00:02:44.191 LIB libspdk_rdma.a 00:02:44.191 LIB libspdk_json.a 00:02:44.450 LIB libspdk_idxd.a 00:02:44.450 LIB libspdk_vmd.a 00:02:44.450 CC lib/jsonrpc/jsonrpc_server.o 00:02:44.450 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:44.450 CC lib/jsonrpc/jsonrpc_client.o 00:02:44.450 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:44.710 LIB libspdk_jsonrpc.a 00:02:44.969 LIB libspdk_env_dpdk.a 00:02:44.969 CC lib/rpc/rpc.o 00:02:45.228 LIB libspdk_rpc.a 00:02:45.489 CC lib/notify/notify.o 00:02:45.489 CC lib/notify/notify_rpc.o 00:02:45.489 CC lib/trace/trace.o 00:02:45.489 CC lib/trace/trace_flags.o 00:02:45.489 CC lib/trace/trace_rpc.o 00:02:45.489 CC lib/sock/sock.o 00:02:45.489 CC lib/sock/sock_rpc.o 00:02:45.489 LIB libspdk_notify.a 00:02:45.749 LIB libspdk_trace.a 00:02:45.749 LIB libspdk_sock.a 00:02:46.009 CC lib/thread/thread.o 00:02:46.009 CC lib/thread/iobuf.o 00:02:46.009 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:46.009 CC lib/nvme/nvme_ctrlr.o 00:02:46.009 CC 
lib/nvme/nvme_fabric.o 00:02:46.009 CC lib/nvme/nvme_ns_cmd.o 00:02:46.009 CC lib/nvme/nvme_ns.o 00:02:46.009 CC lib/nvme/nvme_pcie_common.o 00:02:46.009 CC lib/nvme/nvme_pcie.o 00:02:46.009 CC lib/nvme/nvme_qpair.o 00:02:46.009 CC lib/nvme/nvme.o 00:02:46.009 CC lib/nvme/nvme_quirks.o 00:02:46.009 CC lib/nvme/nvme_transport.o 00:02:46.009 CC lib/nvme/nvme_discovery.o 00:02:46.009 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:46.009 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:46.009 CC lib/nvme/nvme_tcp.o 00:02:46.009 CC lib/nvme/nvme_opal.o 00:02:46.009 CC lib/nvme/nvme_io_msg.o 00:02:46.009 CC lib/nvme/nvme_poll_group.o 00:02:46.009 CC lib/nvme/nvme_zns.o 00:02:46.009 CC lib/nvme/nvme_cuse.o 00:02:46.009 CC lib/nvme/nvme_vfio_user.o 00:02:46.009 CC lib/nvme/nvme_rdma.o 00:02:46.948 LIB libspdk_thread.a 00:02:46.948 CC lib/virtio/virtio.o 00:02:46.948 CC lib/virtio/virtio_vhost_user.o 00:02:46.948 CC lib/vfu_tgt/tgt_endpoint.o 00:02:46.948 CC lib/virtio/virtio_vfio_user.o 00:02:46.948 CC lib/vfu_tgt/tgt_rpc.o 00:02:46.948 CC lib/virtio/virtio_pci.o 00:02:46.948 CC lib/blob/blobstore.o 00:02:46.948 CC lib/blob/request.o 00:02:46.948 CC lib/init/json_config.o 00:02:46.948 CC lib/blob/zeroes.o 00:02:46.948 CC lib/init/subsystem.o 00:02:46.948 CC lib/blob/blob_bs_dev.o 00:02:46.948 CC lib/init/subsystem_rpc.o 00:02:46.948 CC lib/init/rpc.o 00:02:46.948 CC lib/accel/accel.o 00:02:46.948 CC lib/accel/accel_rpc.o 00:02:46.948 CC lib/accel/accel_sw.o 00:02:47.207 LIB libspdk_init.a 00:02:47.207 LIB libspdk_virtio.a 00:02:47.207 LIB libspdk_vfu_tgt.a 00:02:47.207 LIB libspdk_nvme.a 00:02:47.467 CC lib/event/app.o 00:02:47.467 CC lib/event/reactor.o 00:02:47.467 CC lib/event/log_rpc.o 00:02:47.467 CC lib/event/app_rpc.o 00:02:47.467 CC lib/event/scheduler_static.o 00:02:47.727 LIB libspdk_accel.a 00:02:47.727 LIB libspdk_event.a 00:02:47.987 CC lib/bdev/bdev.o 00:02:47.987 CC lib/bdev/bdev_rpc.o 00:02:47.987 CC lib/bdev/bdev_zone.o 00:02:47.987 CC lib/bdev/part.o 00:02:47.987 CC lib/bdev/scsi_nvme.o 00:02:48.557 LIB libspdk_blob.a 00:02:48.817 CC lib/lvol/lvol.o 00:02:48.817 CC lib/blobfs/blobfs.o 00:02:48.817 CC lib/blobfs/tree.o 00:02:49.387 LIB libspdk_lvol.a 00:02:49.387 LIB libspdk_blobfs.a 00:02:49.648 LIB libspdk_bdev.a 00:02:49.907 CC lib/ublk/ublk.o 00:02:49.907 CC lib/ublk/ublk_rpc.o 00:02:49.907 CC lib/nvmf/ctrlr.o 00:02:49.907 CC lib/nvmf/ctrlr_discovery.o 00:02:49.907 CC lib/nvmf/ctrlr_bdev.o 00:02:49.907 CC lib/nvmf/subsystem.o 00:02:49.907 CC lib/nbd/nbd_rpc.o 00:02:49.907 CC lib/nvmf/nvmf_rpc.o 00:02:49.907 CC lib/nvmf/nvmf.o 00:02:49.907 CC lib/nbd/nbd.o 00:02:49.907 CC lib/ftl/ftl_core.o 00:02:49.907 CC lib/nvmf/transport.o 00:02:49.907 CC lib/nvmf/tcp.o 00:02:49.907 CC lib/ftl/ftl_init.o 00:02:49.907 CC lib/scsi/dev.o 00:02:49.907 CC lib/scsi/port.o 00:02:49.907 CC lib/nvmf/vfio_user.o 00:02:49.907 CC lib/scsi/lun.o 00:02:49.907 CC lib/ftl/ftl_layout.o 00:02:49.907 CC lib/scsi/scsi.o 00:02:49.907 CC lib/ftl/ftl_debug.o 00:02:49.907 CC lib/nvmf/rdma.o 00:02:49.907 CC lib/ftl/ftl_io.o 00:02:49.907 CC lib/scsi/scsi_bdev.o 00:02:49.907 CC lib/ftl/ftl_sb.o 00:02:49.907 CC lib/scsi/scsi_pr.o 00:02:49.907 CC lib/ftl/ftl_l2p.o 00:02:49.907 CC lib/scsi/task.o 00:02:49.907 CC lib/scsi/scsi_rpc.o 00:02:49.907 CC lib/ftl/ftl_nv_cache.o 00:02:49.907 CC lib/ftl/ftl_l2p_flat.o 00:02:49.907 CC lib/ftl/ftl_band.o 00:02:49.907 CC lib/ftl/ftl_band_ops.o 00:02:49.908 CC lib/ftl/ftl_writer.o 00:02:49.908 CC lib/ftl/ftl_rq.o 00:02:49.908 CC lib/ftl/ftl_reloc.o 00:02:49.908 CC lib/ftl/ftl_l2p_cache.o 00:02:49.908 
CC lib/ftl/ftl_p2l.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:49.908 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:49.908 CC lib/ftl/utils/ftl_conf.o 00:02:49.908 CC lib/ftl/utils/ftl_md.o 00:02:49.908 CC lib/ftl/utils/ftl_mempool.o 00:02:49.908 CC lib/ftl/utils/ftl_bitmap.o 00:02:49.908 CC lib/ftl/utils/ftl_property.o 00:02:49.908 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:49.908 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:49.908 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:49.908 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:49.908 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:49.908 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:49.908 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:49.908 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:49.908 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:49.908 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:49.908 CC lib/ftl/base/ftl_base_dev.o 00:02:49.908 CC lib/ftl/base/ftl_base_bdev.o 00:02:49.908 CC lib/ftl/ftl_trace.o 00:02:50.167 LIB libspdk_nbd.a 00:02:50.427 LIB libspdk_scsi.a 00:02:50.427 LIB libspdk_ublk.a 00:02:50.427 LIB libspdk_ftl.a 00:02:50.687 CC lib/vhost/vhost.o 00:02:50.687 CC lib/vhost/vhost_rpc.o 00:02:50.687 CC lib/vhost/vhost_scsi.o 00:02:50.687 CC lib/vhost/vhost_blk.o 00:02:50.687 CC lib/vhost/rte_vhost_user.o 00:02:50.687 CC lib/iscsi/conn.o 00:02:50.687 CC lib/iscsi/init_grp.o 00:02:50.687 CC lib/iscsi/iscsi.o 00:02:50.687 CC lib/iscsi/md5.o 00:02:50.687 CC lib/iscsi/param.o 00:02:50.687 CC lib/iscsi/portal_grp.o 00:02:50.687 CC lib/iscsi/tgt_node.o 00:02:50.687 CC lib/iscsi/iscsi_subsystem.o 00:02:50.687 CC lib/iscsi/iscsi_rpc.o 00:02:50.687 CC lib/iscsi/task.o 00:02:51.258 LIB libspdk_nvmf.a 00:02:51.258 LIB libspdk_vhost.a 00:02:51.258 LIB libspdk_iscsi.a 00:02:51.828 CC module/vfu_device/vfu_virtio.o 00:02:51.828 CC module/vfu_device/vfu_virtio_blk.o 00:02:51.828 CC module/vfu_device/vfu_virtio_scsi.o 00:02:51.828 CC module/vfu_device/vfu_virtio_rpc.o 00:02:51.828 CC module/env_dpdk/env_dpdk_rpc.o 00:02:51.828 LIB libspdk_env_dpdk_rpc.a 00:02:52.088 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:52.088 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:52.088 CC module/accel/error/accel_error.o 00:02:52.089 CC module/accel/error/accel_error_rpc.o 00:02:52.089 CC module/sock/posix/posix.o 00:02:52.089 CC module/accel/iaa/accel_iaa.o 00:02:52.089 CC module/scheduler/gscheduler/gscheduler.o 00:02:52.089 CC module/accel/iaa/accel_iaa_rpc.o 00:02:52.089 CC module/accel/ioat/accel_ioat.o 00:02:52.089 CC module/accel/ioat/accel_ioat_rpc.o 00:02:52.089 CC module/accel/dsa/accel_dsa.o 00:02:52.089 CC module/blob/bdev/blob_bdev.o 00:02:52.089 CC module/accel/dsa/accel_dsa_rpc.o 00:02:52.089 LIB libspdk_scheduler_dpdk_governor.a 00:02:52.089 LIB libspdk_accel_error.a 00:02:52.089 LIB libspdk_scheduler_gscheduler.a 00:02:52.089 LIB libspdk_scheduler_dynamic.a 00:02:52.089 LIB libspdk_accel_ioat.a 00:02:52.089 LIB libspdk_accel_iaa.a 00:02:52.089 LIB libspdk_accel_dsa.a 00:02:52.089 LIB libspdk_blob_bdev.a 00:02:52.349 LIB libspdk_vfu_device.a 00:02:52.349 LIB 
libspdk_sock_posix.a 00:02:52.608 CC module/bdev/null/bdev_null.o 00:02:52.608 CC module/bdev/null/bdev_null_rpc.o 00:02:52.608 CC module/bdev/passthru/vbdev_passthru.o 00:02:52.608 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:52.608 CC module/bdev/gpt/gpt.o 00:02:52.608 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:52.608 CC module/bdev/gpt/vbdev_gpt.o 00:02:52.608 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:52.608 CC module/bdev/ftl/bdev_ftl.o 00:02:52.608 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:52.608 CC module/bdev/delay/vbdev_delay.o 00:02:52.608 CC module/bdev/raid/bdev_raid_rpc.o 00:02:52.608 CC module/bdev/aio/bdev_aio_rpc.o 00:02:52.608 CC module/bdev/raid/bdev_raid.o 00:02:52.608 CC module/bdev/aio/bdev_aio.o 00:02:52.608 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:52.608 CC module/bdev/raid/raid0.o 00:02:52.608 CC module/bdev/raid/bdev_raid_sb.o 00:02:52.608 CC module/bdev/raid/raid1.o 00:02:52.608 CC module/bdev/lvol/vbdev_lvol.o 00:02:52.608 CC module/blobfs/bdev/blobfs_bdev.o 00:02:52.608 CC module/bdev/nvme/bdev_nvme.o 00:02:52.608 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:52.608 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:52.608 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:52.608 CC module/bdev/raid/concat.o 00:02:52.608 CC module/bdev/nvme/bdev_mdns_client.o 00:02:52.608 CC module/bdev/nvme/nvme_rpc.o 00:02:52.608 CC module/bdev/nvme/vbdev_opal.o 00:02:52.608 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:52.608 CC module/bdev/iscsi/bdev_iscsi.o 00:02:52.608 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:52.608 CC module/bdev/malloc/bdev_malloc.o 00:02:52.608 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:52.608 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:52.608 CC module/bdev/error/vbdev_error_rpc.o 00:02:52.608 CC module/bdev/error/vbdev_error.o 00:02:52.608 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:52.608 CC module/bdev/split/vbdev_split.o 00:02:52.608 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:52.608 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:52.608 CC module/bdev/split/vbdev_split_rpc.o 00:02:52.868 LIB libspdk_blobfs_bdev.a 00:02:52.868 LIB libspdk_bdev_null.a 00:02:52.868 LIB libspdk_bdev_split.a 00:02:52.868 LIB libspdk_bdev_gpt.a 00:02:52.868 LIB libspdk_bdev_error.a 00:02:52.868 LIB libspdk_bdev_ftl.a 00:02:52.868 LIB libspdk_bdev_passthru.a 00:02:52.868 LIB libspdk_bdev_aio.a 00:02:52.868 LIB libspdk_bdev_zone_block.a 00:02:52.868 LIB libspdk_bdev_iscsi.a 00:02:52.868 LIB libspdk_bdev_delay.a 00:02:52.868 LIB libspdk_bdev_malloc.a 00:02:52.868 LIB libspdk_bdev_lvol.a 00:02:52.868 LIB libspdk_bdev_virtio.a 00:02:53.128 LIB libspdk_bdev_raid.a 00:02:53.698 LIB libspdk_bdev_nvme.a 00:02:54.269 CC module/event/subsystems/iobuf/iobuf.o 00:02:54.269 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:54.269 CC module/event/subsystems/sock/sock.o 00:02:54.269 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:54.269 CC module/event/subsystems/vmd/vmd.o 00:02:54.269 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:54.269 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:54.269 CC module/event/subsystems/scheduler/scheduler.o 00:02:54.530 LIB libspdk_event_sock.a 00:02:54.530 LIB libspdk_event_vfu_tgt.a 00:02:54.530 LIB libspdk_event_vhost_blk.a 00:02:54.530 LIB libspdk_event_iobuf.a 00:02:54.530 LIB libspdk_event_vmd.a 00:02:54.530 LIB libspdk_event_scheduler.a 00:02:54.790 CC module/event/subsystems/accel/accel.o 00:02:54.790 LIB libspdk_event_accel.a 00:02:55.360 CC module/event/subsystems/bdev/bdev.o 00:02:55.360 LIB 
libspdk_event_bdev.a 00:02:55.621 CC module/event/subsystems/nbd/nbd.o 00:02:55.621 CC module/event/subsystems/ublk/ublk.o 00:02:55.621 CC module/event/subsystems/scsi/scsi.o 00:02:55.621 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:55.621 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:55.881 LIB libspdk_event_nbd.a 00:02:55.881 LIB libspdk_event_ublk.a 00:02:55.881 LIB libspdk_event_scsi.a 00:02:55.881 LIB libspdk_event_nvmf.a 00:02:56.142 CC module/event/subsystems/iscsi/iscsi.o 00:02:56.142 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:56.143 LIB libspdk_event_vhost_scsi.a 00:02:56.143 LIB libspdk_event_iscsi.a 00:02:56.717 CC app/trace_record/trace_record.o 00:02:56.717 CC app/spdk_nvme_perf/perf.o 00:02:56.717 CXX app/trace/trace.o 00:02:56.717 CC app/spdk_nvme_identify/identify.o 00:02:56.717 CC app/spdk_lspci/spdk_lspci.o 00:02:56.717 CC app/spdk_nvme_discover/discovery_aer.o 00:02:56.717 TEST_HEADER include/spdk/accel_module.h 00:02:56.717 TEST_HEADER include/spdk/accel.h 00:02:56.717 TEST_HEADER include/spdk/assert.h 00:02:56.717 TEST_HEADER include/spdk/base64.h 00:02:56.717 CC app/spdk_top/spdk_top.o 00:02:56.717 TEST_HEADER include/spdk/bdev.h 00:02:56.717 TEST_HEADER include/spdk/barrier.h 00:02:56.717 TEST_HEADER include/spdk/bdev_module.h 00:02:56.717 TEST_HEADER include/spdk/bdev_zone.h 00:02:56.717 TEST_HEADER include/spdk/bit_array.h 00:02:56.717 CC test/rpc_client/rpc_client_test.o 00:02:56.717 TEST_HEADER include/spdk/bit_pool.h 00:02:56.717 TEST_HEADER include/spdk/blob_bdev.h 00:02:56.717 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:56.717 TEST_HEADER include/spdk/blobfs.h 00:02:56.717 TEST_HEADER include/spdk/blob.h 00:02:56.717 TEST_HEADER include/spdk/conf.h 00:02:56.717 TEST_HEADER include/spdk/config.h 00:02:56.717 TEST_HEADER include/spdk/cpuset.h 00:02:56.717 TEST_HEADER include/spdk/crc16.h 00:02:56.717 TEST_HEADER include/spdk/crc32.h 00:02:56.717 TEST_HEADER include/spdk/crc64.h 00:02:56.717 TEST_HEADER include/spdk/dif.h 00:02:56.717 TEST_HEADER include/spdk/dma.h 00:02:56.717 TEST_HEADER include/spdk/endian.h 00:02:56.717 TEST_HEADER include/spdk/env_dpdk.h 00:02:56.717 TEST_HEADER include/spdk/env.h 00:02:56.717 TEST_HEADER include/spdk/fd_group.h 00:02:56.717 TEST_HEADER include/spdk/event.h 00:02:56.717 TEST_HEADER include/spdk/fd.h 00:02:56.717 TEST_HEADER include/spdk/file.h 00:02:56.717 TEST_HEADER include/spdk/ftl.h 00:02:56.717 TEST_HEADER include/spdk/hexlify.h 00:02:56.717 TEST_HEADER include/spdk/gpt_spec.h 00:02:56.717 TEST_HEADER include/spdk/histogram_data.h 00:02:56.717 TEST_HEADER include/spdk/idxd.h 00:02:56.717 TEST_HEADER include/spdk/idxd_spec.h 00:02:56.717 TEST_HEADER include/spdk/init.h 00:02:56.717 TEST_HEADER include/spdk/ioat.h 00:02:56.717 TEST_HEADER include/spdk/ioat_spec.h 00:02:56.717 TEST_HEADER include/spdk/iscsi_spec.h 00:02:56.717 TEST_HEADER include/spdk/json.h 00:02:56.717 TEST_HEADER include/spdk/jsonrpc.h 00:02:56.717 TEST_HEADER include/spdk/likely.h 00:02:56.717 TEST_HEADER include/spdk/log.h 00:02:56.717 TEST_HEADER include/spdk/lvol.h 00:02:56.717 TEST_HEADER include/spdk/memory.h 00:02:56.717 TEST_HEADER include/spdk/mmio.h 00:02:56.717 TEST_HEADER include/spdk/nbd.h 00:02:56.717 TEST_HEADER include/spdk/notify.h 00:02:56.717 TEST_HEADER include/spdk/nvme_intel.h 00:02:56.717 TEST_HEADER include/spdk/nvme.h 00:02:56.717 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:56.717 CC app/nvmf_tgt/nvmf_main.o 00:02:56.717 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:56.717 TEST_HEADER 
include/spdk/nvme_spec.h 00:02:56.717 TEST_HEADER include/spdk/nvme_zns.h 00:02:56.717 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:56.717 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:56.717 TEST_HEADER include/spdk/nvmf.h 00:02:56.717 TEST_HEADER include/spdk/nvmf_spec.h 00:02:56.717 TEST_HEADER include/spdk/nvmf_transport.h 00:02:56.717 CC app/iscsi_tgt/iscsi_tgt.o 00:02:56.717 TEST_HEADER include/spdk/opal.h 00:02:56.717 TEST_HEADER include/spdk/pci_ids.h 00:02:56.717 TEST_HEADER include/spdk/opal_spec.h 00:02:56.717 TEST_HEADER include/spdk/pipe.h 00:02:56.717 CC app/vhost/vhost.o 00:02:56.717 CC app/spdk_dd/spdk_dd.o 00:02:56.717 TEST_HEADER include/spdk/queue.h 00:02:56.717 TEST_HEADER include/spdk/reduce.h 00:02:56.717 TEST_HEADER include/spdk/rpc.h 00:02:56.717 TEST_HEADER include/spdk/scheduler.h 00:02:56.717 TEST_HEADER include/spdk/scsi_spec.h 00:02:56.717 TEST_HEADER include/spdk/scsi.h 00:02:56.717 TEST_HEADER include/spdk/stdinc.h 00:02:56.717 TEST_HEADER include/spdk/sock.h 00:02:56.717 TEST_HEADER include/spdk/string.h 00:02:56.717 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:56.717 TEST_HEADER include/spdk/thread.h 00:02:56.717 TEST_HEADER include/spdk/trace.h 00:02:56.717 TEST_HEADER include/spdk/tree.h 00:02:56.717 CC app/spdk_tgt/spdk_tgt.o 00:02:56.717 TEST_HEADER include/spdk/trace_parser.h 00:02:56.717 TEST_HEADER include/spdk/util.h 00:02:56.718 TEST_HEADER include/spdk/ublk.h 00:02:56.718 TEST_HEADER include/spdk/uuid.h 00:02:56.718 TEST_HEADER include/spdk/version.h 00:02:56.718 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:56.718 TEST_HEADER include/spdk/vhost.h 00:02:56.718 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:56.718 TEST_HEADER include/spdk/vmd.h 00:02:56.718 TEST_HEADER include/spdk/xor.h 00:02:56.718 TEST_HEADER include/spdk/zipf.h 00:02:56.718 CXX test/cpp_headers/accel.o 00:02:56.718 CXX test/cpp_headers/accel_module.o 00:02:56.718 CXX test/cpp_headers/assert.o 00:02:56.718 CXX test/cpp_headers/base64.o 00:02:56.718 CXX test/cpp_headers/barrier.o 00:02:56.718 CXX test/cpp_headers/bdev_module.o 00:02:56.718 CXX test/cpp_headers/bdev.o 00:02:56.718 CXX test/cpp_headers/bdev_zone.o 00:02:56.718 CXX test/cpp_headers/bit_array.o 00:02:56.718 CXX test/cpp_headers/bit_pool.o 00:02:56.718 CXX test/cpp_headers/blob_bdev.o 00:02:56.718 CXX test/cpp_headers/blobfs_bdev.o 00:02:56.718 CXX test/cpp_headers/blobfs.o 00:02:56.718 CXX test/cpp_headers/blob.o 00:02:56.718 CXX test/cpp_headers/conf.o 00:02:56.718 CXX test/cpp_headers/config.o 00:02:56.718 CXX test/cpp_headers/cpuset.o 00:02:56.718 CXX test/cpp_headers/crc16.o 00:02:56.718 CXX test/cpp_headers/crc32.o 00:02:56.718 CXX test/cpp_headers/crc64.o 00:02:56.718 CXX test/cpp_headers/dif.o 00:02:56.718 CXX test/cpp_headers/dma.o 00:02:56.718 CXX test/cpp_headers/endian.o 00:02:56.718 CXX test/cpp_headers/env_dpdk.o 00:02:56.718 CXX test/cpp_headers/env.o 00:02:56.718 CXX test/cpp_headers/event.o 00:02:56.718 CXX test/cpp_headers/fd_group.o 00:02:56.718 CXX test/cpp_headers/file.o 00:02:56.718 CXX test/cpp_headers/fd.o 00:02:56.718 CXX test/cpp_headers/gpt_spec.o 00:02:56.718 CXX test/cpp_headers/ftl.o 00:02:56.718 CXX test/cpp_headers/hexlify.o 00:02:56.718 CXX test/cpp_headers/histogram_data.o 00:02:56.718 CXX test/cpp_headers/idxd.o 00:02:56.718 CXX test/cpp_headers/idxd_spec.o 00:02:56.718 CXX test/cpp_headers/init.o 00:02:56.718 CC examples/ioat/verify/verify.o 00:02:56.718 CC test/nvme/aer/aer.o 00:02:56.718 CC test/nvme/fused_ordering/fused_ordering.o 00:02:56.718 CC test/nvme/reset/reset.o 
00:02:56.718 CC examples/vmd/lsvmd/lsvmd.o 00:02:56.718 CC examples/ioat/perf/perf.o 00:02:56.718 CC test/nvme/fdp/fdp.o 00:02:56.718 CC test/nvme/overhead/overhead.o 00:02:56.718 CC examples/vmd/led/led.o 00:02:56.718 CC test/nvme/e2edp/nvme_dp.o 00:02:56.718 CC test/nvme/err_injection/err_injection.o 00:02:56.718 CC test/nvme/connect_stress/connect_stress.o 00:02:56.718 CC test/nvme/boot_partition/boot_partition.o 00:02:56.718 CC test/nvme/startup/startup.o 00:02:56.718 CC test/nvme/simple_copy/simple_copy.o 00:02:56.718 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:56.718 CC test/nvme/reserve/reserve.o 00:02:56.718 CC test/event/reactor/reactor.o 00:02:56.718 CC test/nvme/cuse/cuse.o 00:02:56.718 CC test/env/vtophys/vtophys.o 00:02:56.718 CC test/nvme/compliance/nvme_compliance.o 00:02:56.718 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:56.718 CC test/env/pci/pci_ut.o 00:02:56.718 CC test/event/reactor_perf/reactor_perf.o 00:02:56.718 CC test/env/memory/memory_ut.o 00:02:56.718 CC examples/accel/perf/accel_perf.o 00:02:56.718 CC test/event/event_perf/event_perf.o 00:02:56.718 CC test/nvme/sgl/sgl.o 00:02:56.718 CC examples/util/zipf/zipf.o 00:02:56.718 CC examples/nvme/reconnect/reconnect.o 00:02:56.718 CC examples/sock/hello_world/hello_sock.o 00:02:56.718 CC examples/nvme/hello_world/hello_world.o 00:02:56.718 LINK spdk_lspci 00:02:56.718 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:56.718 CC examples/nvme/hotplug/hotplug.o 00:02:56.718 CC test/app/histogram_perf/histogram_perf.o 00:02:56.718 CXX test/cpp_headers/ioat.o 00:02:56.718 CC test/app/jsoncat/jsoncat.o 00:02:56.718 CC examples/nvme/arbitration/arbitration.o 00:02:56.718 CC examples/nvme/abort/abort.o 00:02:56.718 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:56.718 CC examples/idxd/perf/perf.o 00:02:56.718 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:56.718 CC app/fio/nvme/fio_plugin.o 00:02:56.718 CC test/app/stub/stub.o 00:02:56.718 CC test/thread/poller_perf/poller_perf.o 00:02:56.718 CC test/thread/lock/spdk_lock.o 00:02:56.718 CC test/event/app_repeat/app_repeat.o 00:02:56.718 CC examples/bdev/bdevperf/bdevperf.o 00:02:56.718 CC examples/bdev/hello_world/hello_bdev.o 00:02:56.718 CC test/blobfs/mkfs/mkfs.o 00:02:56.718 CC examples/blob/hello_world/hello_blob.o 00:02:56.718 CC examples/nvmf/nvmf/nvmf.o 00:02:56.718 CC examples/blob/cli/blobcli.o 00:02:56.718 CC test/accel/dif/dif.o 00:02:56.718 CC test/bdev/bdevio/bdevio.o 00:02:56.718 CC examples/thread/thread/thread_ex.o 00:02:56.718 CC test/event/scheduler/scheduler.o 00:02:56.718 CC test/dma/test_dma/test_dma.o 00:02:56.718 CC app/fio/bdev/fio_plugin.o 00:02:56.718 LINK spdk_nvme_discover 00:02:56.718 CC test/app/bdev_svc/bdev_svc.o 00:02:56.718 LINK rpc_client_test 00:02:56.718 CC test/env/mem_callbacks/mem_callbacks.o 00:02:56.718 CC test/lvol/esnap/esnap.o 00:02:56.986 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:56.986 LINK spdk_trace_record 00:02:56.986 LINK lsvmd 00:02:56.986 LINK interrupt_tgt 00:02:56.986 CXX test/cpp_headers/ioat_spec.o 00:02:56.986 CXX test/cpp_headers/iscsi_spec.o 00:02:56.986 LINK nvmf_tgt 00:02:56.986 LINK led 00:02:56.986 CXX test/cpp_headers/json.o 00:02:56.986 CXX test/cpp_headers/jsonrpc.o 00:02:56.986 CXX test/cpp_headers/likely.o 00:02:56.986 CXX test/cpp_headers/log.o 00:02:56.986 CXX test/cpp_headers/lvol.o 00:02:56.987 CXX test/cpp_headers/memory.o 00:02:56.987 LINK vhost 00:02:56.987 CXX test/cpp_headers/mmio.o 00:02:56.987 CXX test/cpp_headers/nbd.o 00:02:56.987 CXX test/cpp_headers/notify.o 
00:02:56.987 CXX test/cpp_headers/nvme.o 00:02:56.987 CXX test/cpp_headers/nvme_intel.o 00:02:56.987 CXX test/cpp_headers/nvme_ocssd.o 00:02:56.987 LINK reactor 00:02:56.987 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:56.987 CXX test/cpp_headers/nvme_spec.o 00:02:56.987 LINK vtophys 00:02:56.987 LINK event_perf 00:02:56.987 CXX test/cpp_headers/nvme_zns.o 00:02:56.987 CXX test/cpp_headers/nvmf_cmd.o 00:02:56.987 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:56.987 LINK jsoncat 00:02:56.987 CXX test/cpp_headers/nvmf.o 00:02:56.987 CXX test/cpp_headers/nvmf_spec.o 00:02:56.987 CXX test/cpp_headers/nvmf_transport.o 00:02:56.987 CXX test/cpp_headers/opal.o 00:02:56.987 CXX test/cpp_headers/opal_spec.o 00:02:56.987 LINK reactor_perf 00:02:56.987 LINK zipf 00:02:56.987 CXX test/cpp_headers/pci_ids.o 00:02:56.987 LINK histogram_perf 00:02:56.987 LINK connect_stress 00:02:56.987 LINK env_dpdk_post_init 00:02:56.987 LINK poller_perf 00:02:56.987 CXX test/cpp_headers/pipe.o 00:02:56.987 LINK startup 00:02:56.987 LINK iscsi_tgt 00:02:56.987 CXX test/cpp_headers/queue.o 00:02:56.987 LINK boot_partition 00:02:56.987 CXX test/cpp_headers/reduce.o 00:02:56.987 CXX test/cpp_headers/rpc.o 00:02:56.987 CXX test/cpp_headers/scheduler.o 00:02:56.987 CXX test/cpp_headers/scsi.o 00:02:56.987 LINK spdk_tgt 00:02:56.987 LINK err_injection 00:02:56.987 LINK app_repeat 00:02:56.987 LINK verify 00:02:56.987 CXX test/cpp_headers/scsi_spec.o 00:02:56.987 LINK fused_ordering 00:02:56.987 CXX test/cpp_headers/sock.o 00:02:56.987 LINK doorbell_aers 00:02:56.987 LINK ioat_perf 00:02:56.987 LINK pmr_persistence 00:02:56.987 LINK reserve 00:02:56.987 LINK stub 00:02:56.987 LINK simple_copy 00:02:56.987 LINK cmb_copy 00:02:56.987 CXX test/cpp_headers/stdinc.o 00:02:56.987 LINK hello_world 00:02:56.987 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:56.987 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:56.987 LINK hello_sock 00:02:56.987 LINK aer 00:02:56.987 LINK reset 00:02:56.987 LINK hotplug 00:02:56.987 LINK nvme_dp 00:02:56.987 LINK mkfs 00:02:56.987 LINK bdev_svc 00:02:56.987 LINK fdp 00:02:56.987 LINK hello_bdev 00:02:56.987 LINK overhead 00:02:56.987 LINK sgl 00:02:56.987 LINK hello_blob 00:02:56.987 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:56.987 LINK mem_callbacks 00:02:56.987 LINK scheduler 00:02:56.987 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:56.987 LINK thread 00:02:56.987 CXX test/cpp_headers/string.o 00:02:56.987 CXX test/cpp_headers/thread.o 00:02:56.987 LINK spdk_trace 00:02:57.248 CXX test/cpp_headers/trace.o 00:02:57.248 CXX test/cpp_headers/trace_parser.o 00:02:57.248 CXX test/cpp_headers/tree.o 00:02:57.248 CXX test/cpp_headers/ublk.o 00:02:57.248 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:57.248 CXX test/cpp_headers/util.o 00:02:57.248 CXX test/cpp_headers/uuid.o 00:02:57.248 CXX test/cpp_headers/version.o 00:02:57.248 CXX test/cpp_headers/vfio_user_pci.o 00:02:57.248 LINK nvmf 00:02:57.248 CXX test/cpp_headers/vfio_user_spec.o 00:02:57.248 LINK idxd_perf 00:02:57.248 CXX test/cpp_headers/vhost.o 00:02:57.248 CXX test/cpp_headers/vmd.o 00:02:57.248 CXX test/cpp_headers/xor.o 00:02:57.248 CXX test/cpp_headers/zipf.o 00:02:57.248 LINK reconnect 00:02:57.248 LINK arbitration 00:02:57.248 LINK abort 00:02:57.248 LINK spdk_dd 00:02:57.248 LINK dif 00:02:57.248 LINK test_dma 00:02:57.248 LINK accel_perf 00:02:57.248 LINK bdevio 00:02:57.248 LINK pci_ut 00:02:57.248 LINK nvme_compliance 00:02:57.248 LINK blobcli 00:02:57.248 LINK nvme_manage 00:02:57.507 LINK llvm_vfio_fuzz 
00:02:57.507 LINK nvme_fuzz 00:02:57.507 LINK spdk_bdev 00:02:57.507 LINK memory_ut 00:02:57.507 LINK spdk_nvme 00:02:57.507 LINK spdk_nvme_identify 00:02:57.507 LINK spdk_nvme_perf 00:02:57.766 LINK vhost_fuzz 00:02:57.766 LINK bdevperf 00:02:57.766 LINK spdk_top 00:02:57.766 LINK cuse 00:02:58.026 LINK llvm_nvme_fuzz 00:02:58.026 LINK spdk_lock 00:02:58.286 LINK iscsi_fuzz 00:03:00.197 LINK esnap 00:03:00.458 00:03:00.458 real 0m23.620s 00:03:00.458 user 4m14.550s 00:03:00.458 sys 2m1.270s 00:03:00.458 16:38:46 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:03:00.458 16:38:46 -- common/autotest_common.sh@10 -- $ set +x 00:03:00.458 ************************************ 00:03:00.458 END TEST make 00:03:00.458 ************************************ 00:03:00.720 16:38:46 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:00.720 16:38:46 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:00.720 16:38:46 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:00.720 16:38:46 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:00.720 16:38:46 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:00.720 16:38:46 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:00.720 16:38:46 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:00.720 16:38:46 -- scripts/common.sh@335 -- # IFS=.-: 00:03:00.720 16:38:46 -- scripts/common.sh@335 -- # read -ra ver1 00:03:00.720 16:38:46 -- scripts/common.sh@336 -- # IFS=.-: 00:03:00.720 16:38:46 -- scripts/common.sh@336 -- # read -ra ver2 00:03:00.720 16:38:46 -- scripts/common.sh@337 -- # local 'op=<' 00:03:00.720 16:38:46 -- scripts/common.sh@339 -- # ver1_l=2 00:03:00.720 16:38:46 -- scripts/common.sh@340 -- # ver2_l=1 00:03:00.720 16:38:46 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:00.720 16:38:46 -- scripts/common.sh@343 -- # case "$op" in 00:03:00.720 16:38:46 -- scripts/common.sh@344 -- # : 1 00:03:00.720 16:38:46 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:00.720 16:38:46 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:00.720 16:38:46 -- scripts/common.sh@364 -- # decimal 1 00:03:00.720 16:38:46 -- scripts/common.sh@352 -- # local d=1 00:03:00.720 16:38:46 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:00.720 16:38:46 -- scripts/common.sh@354 -- # echo 1 00:03:00.720 16:38:46 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:00.720 16:38:46 -- scripts/common.sh@365 -- # decimal 2 00:03:00.720 16:38:46 -- scripts/common.sh@352 -- # local d=2 00:03:00.720 16:38:46 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:00.720 16:38:46 -- scripts/common.sh@354 -- # echo 2 00:03:00.720 16:38:46 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:00.720 16:38:46 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:00.720 16:38:46 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:00.720 16:38:46 -- scripts/common.sh@367 -- # return 0 00:03:00.720 16:38:46 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:00.720 16:38:46 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:00.720 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:00.720 --rc genhtml_branch_coverage=1 00:03:00.720 --rc genhtml_function_coverage=1 00:03:00.720 --rc genhtml_legend=1 00:03:00.720 --rc geninfo_all_blocks=1 00:03:00.720 --rc geninfo_unexecuted_blocks=1 00:03:00.720 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:00.720 ' 00:03:00.720 16:38:46 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:00.720 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:00.720 --rc genhtml_branch_coverage=1 00:03:00.720 --rc genhtml_function_coverage=1 00:03:00.720 --rc genhtml_legend=1 00:03:00.720 --rc geninfo_all_blocks=1 00:03:00.720 --rc geninfo_unexecuted_blocks=1 00:03:00.720 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:00.720 ' 00:03:00.720 16:38:46 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:00.720 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:00.720 --rc genhtml_branch_coverage=1 00:03:00.720 --rc genhtml_function_coverage=1 00:03:00.720 --rc genhtml_legend=1 00:03:00.720 --rc geninfo_all_blocks=1 00:03:00.720 --rc geninfo_unexecuted_blocks=1 00:03:00.720 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:00.720 ' 00:03:00.720 16:38:46 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:00.720 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:00.720 --rc genhtml_branch_coverage=1 00:03:00.720 --rc genhtml_function_coverage=1 00:03:00.720 --rc genhtml_legend=1 00:03:00.720 --rc geninfo_all_blocks=1 00:03:00.720 --rc geninfo_unexecuted_blocks=1 00:03:00.720 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:00.720 ' 00:03:00.720 16:38:46 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:00.720 16:38:46 -- nvmf/common.sh@7 -- # uname -s 00:03:00.720 16:38:46 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:00.720 16:38:46 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:00.720 16:38:46 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:00.720 16:38:46 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:00.720 16:38:46 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:00.720 16:38:46 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:00.720 16:38:46 -- nvmf/common.sh@14 -- 
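The xtrace above is autotest_common.sh probing the installed lcov: `lt 1.15 2` splits each version string on `.`, `-`, and `:` and compares the fields numerically, with missing fields treated as 0, to decide which lcov option spelling to use. A minimal standalone sketch of that comparison pattern in bash — the `lt` helper below is illustrative, not the verbatim scripts/common.sh source:

    # Field-wise version compare, mirroring the cmp_versions trace above.
    # Returns 0 when $1 < $2, so `lt 1.15 2` succeeds as it does in the log.
    lt() {
        local -a ver1 ver2
        local v
        IFS=.-: read -ra ver1 <<< "$1"
        IFS=.-: read -ra ver2 <<< "$2"
        for (( v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++ )); do
            # Missing fields compare as 0 (e.g. "2" vs "1.15" becomes 2.0 vs 1.15).
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
        done
        return 1   # equal versions are not "less than"
    }

    lt 1.15 2 && echo "lcov is older than 2.x: use the --rc lcov_* option spelling"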
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:00.720 16:38:46 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:00.720 16:38:46 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:00.720 16:38:46 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:00.720 16:38:46 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:00.720 16:38:46 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:00.720 16:38:46 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:00.720 16:38:46 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:00.721 16:38:46 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:00.721 16:38:46 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:00.721 16:38:46 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:00.721 16:38:46 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:00.721 16:38:46 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:00.721 16:38:46 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:00.721 16:38:46 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:00.721 16:38:46 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:00.721 16:38:46 -- paths/export.sh@5 -- # export PATH 00:03:00.721 16:38:46 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:00.721 16:38:46 -- nvmf/common.sh@46 -- # : 0 00:03:00.721 16:38:46 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:03:00.721 16:38:46 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:03:00.721 16:38:46 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:03:00.721 16:38:46 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:00.721 16:38:46 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:00.721 16:38:46 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:03:00.721 16:38:46 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:03:00.721 16:38:46 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:03:00.721 16:38:46 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:00.721 16:38:46 -- spdk/autotest.sh@32 -- # uname -s 00:03:00.721 16:38:46 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:00.721 16:38:46 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:00.721 16:38:46 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:00.721 16:38:46 -- 
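In the nvmf/common.sh trace above, the host identity is derived once from `nvme gen-hostnqn` and packaged as an argument array that later `nvme connect` calls splat in. A sketch of that pattern, assuming nvme-cli is installed; peeling the host ID off the NQN with a suffix expansion is one plausible derivation (the log only shows that the two values share the same UUID), so treat it as illustrative:

    # Derive the host identity once, then reuse it for every connect.
    NVME_HOSTNQN=$(nvme gen-hostnqn)            # e.g. nqn.2014-08.org.nvmexpress:uuid:<uuid>
    NVME_HOSTID=${NVME_HOSTNQN##*:}             # trailing UUID portion of the NQN
    NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")

    # Usage: array expansion keeps each flag a separate argv entry.
    # nvme connect -t tcp -a 127.0.0.1 -s 4420 -n <subsystem-nqn> "${NVME_HOST[@]}"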
spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:00.721 16:38:46 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:00.721 16:38:46 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:00.721 16:38:46 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:00.721 16:38:46 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:00.721 16:38:46 -- spdk/autotest.sh@48 -- # udevadm_pid=408759 00:03:00.721 16:38:46 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:00.721 16:38:46 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:00.721 16:38:46 -- spdk/autotest.sh@54 -- # echo 408761 00:03:00.721 16:38:46 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:00.721 16:38:46 -- spdk/autotest.sh@56 -- # echo 408762 00:03:00.721 16:38:46 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:03:00.721 16:38:46 -- spdk/autotest.sh@58 -- # [[ ............................... != QEMU ]] 00:03:00.721 16:38:46 -- spdk/autotest.sh@60 -- # echo 408763 00:03:00.721 16:38:46 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:00.721 16:38:46 -- spdk/autotest.sh@62 -- # echo 408764 00:03:00.721 16:38:46 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:00.721 16:38:46 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:03:00.721 16:38:46 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:03:00.721 16:38:46 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:00.721 16:38:46 -- common/autotest_common.sh@10 -- # set +x 00:03:00.721 16:38:46 -- spdk/autotest.sh@70 -- # create_test_list 00:03:00.721 16:38:46 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:00.721 16:38:46 -- common/autotest_common.sh@10 -- # set +x 00:03:00.721 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:03:00.721 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:03:00.982 16:38:46 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:00.982 16:38:46 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:00.982 16:38:46 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:00.982 16:38:46 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:00.982 16:38:46 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:00.982 16:38:46 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:03:00.982 16:38:46 -- common/autotest_common.sh@1450 -- # uname 00:03:00.982 16:38:46 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:03:00.982 16:38:46 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 
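autotest.sh@33-40 above saves the default systemd-coredump handler and points kernel crash dumps at SPDK's core-collector.sh instead, while autotest.sh@47-62 forks the udevadm and power/temperature monitors into the background. A rough sketch of the core-pattern save/replace/restore dance, with the SPDK path shortened to a placeholder; this must run as root:

    # Save the current pipe handler so it can be restored after the run.
    old_core_pattern=$(< /proc/sys/kernel/core_pattern)

    # Pipe every crash into the collector: %P pid, %s signal, %t time.
    echo '|/path/to/spdk/scripts/core-collector.sh %P %s %t' > /proc/sys/kernel/core_pattern

    # Put the original handler back no matter how the test run exits.
    trap 'echo "$old_core_pattern" > /proc/sys/kernel/core_pattern' EXIT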
00:03:00.982 16:38:46 -- common/autotest_common.sh@1470 -- # uname 00:03:00.982 16:38:46 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:03:00.982 16:38:46 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:03:00.982 16:38:46 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:00.982 lcov: LCOV version 1.15 00:03:00.982 16:38:46 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:02.895 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:03:02.895 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:03:02.895 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:03:15.123 16:39:00 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:03:15.123 16:39:00 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:15.123 16:39:00 -- common/autotest_common.sh@10 -- # set +x 00:03:15.123 16:39:00 -- spdk/autotest.sh@89 -- # rm -f 00:03:15.123 16:39:00 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:18.418 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:18.418 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:18.418 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:18.418 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:18.418 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:18.418 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:18.418 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:18.679 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:18.679 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:18.679 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:18.679 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:18.679 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:18.679 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:18.679 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:18.679 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:18.679 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:18.939 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:18.939 16:39:04 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:03:18.939 16:39:04 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:18.939 16:39:04 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:18.939 16:39:04 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:18.939 16:39:04 -- 
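The lcov invocation above, with `-i -t Baseline`, is the front half of the usual gcov flow: capture a zero-count baseline before any test runs so that files never executed (like the three ftl upgrade units geninfo warns about) still appear in the final report at 0%. A sketch of the complete sequence under the same `--gcov-tool` wrapper; output file names are illustrative, and LCOV_OPTS is deliberately left unquoted so it word-splits into separate flags, as in the log:

    LCOV_OPTS="--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 \
               --gcov-tool ./test/fuzz/llvm/llvm-gcov.sh"

    # 1. Baseline: -i records zero hits for every instrumented file.
    lcov $LCOV_OPTS -q -c --no-external -i -t Baseline -d . -o cov_base.info

    # 2. ... run the test suites, accumulating .gcda counters ...

    # 3. Post-test capture, then merge so untouched files keep their 0% rows.
    lcov $LCOV_OPTS -q -c --no-external -t Tests -d . -o cov_test.info
    lcov $LCOV_OPTS -q -a cov_base.info -a cov_test.info -o cov_total.info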
common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:18.939 16:39:04 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:18.939 16:39:04 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:18.939 16:39:04 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:18.939 16:39:04 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:18.939 16:39:04 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:03:18.939 16:39:04 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:03:18.939 16:39:04 -- spdk/autotest.sh@108 -- # grep -v p 00:03:18.939 16:39:04 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:18.939 16:39:04 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:03:18.939 16:39:04 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:03:18.939 16:39:04 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:18.939 16:39:04 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:18.939 No valid GPT data, bailing 00:03:18.939 16:39:04 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:18.939 16:39:04 -- scripts/common.sh@393 -- # pt= 00:03:18.939 16:39:04 -- scripts/common.sh@394 -- # return 1 00:03:18.939 16:39:04 -- spdk/autotest.sh@112 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:18.939 1+0 records in 00:03:18.939 1+0 records out 00:03:18.939 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00640302 s, 164 MB/s 00:03:18.939 16:39:04 -- spdk/autotest.sh@116 -- # sync 00:03:18.939 16:39:04 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:18.939 16:39:04 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:18.939 16:39:04 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:27.078 16:39:11 -- spdk/autotest.sh@122 -- # uname -s 00:03:27.078 16:39:11 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:03:27.078 16:39:11 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:27.078 16:39:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:27.078 16:39:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:27.078 16:39:11 -- common/autotest_common.sh@10 -- # set +x 00:03:27.078 ************************************ 00:03:27.078 START TEST setup.sh 00:03:27.078 ************************************ 00:03:27.078 16:39:11 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:27.078 * Looking for test storage... 
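autotest.sh@94-112 above is the pre-clean pass: namespaces whose sysfs `queue/zoned` attribute is anything other than `none` are set aside, and any remaining disk on which spdk-gpt.py and `blkid` find no partition table ("No valid GPT data, bailing") gets its first MiB zeroed so stale metadata cannot confuse later tests. A rough bash equivalent of those two checks, with the device name hard-coded for illustration; the dd is destructive, so this is a sketch, not something to run on a disk you care about:

    dev=nvme0n1

    # Zoned block devices need ZBC/ZNS-aware tooling; skip the plain wipe.
    if [[ -e /sys/block/$dev/queue/zoned && $(< /sys/block/$dev/queue/zoned) != none ]]; then
        echo "skipping zoned device $dev"
    elif [[ -z $(blkid -s PTTYPE -o value "/dev/$dev") ]]; then
        # No partition table detected: scrub the first MiB, matching the
        # 1048576-byte dd transfer reported in the log above.
        dd if=/dev/zero of="/dev/$dev" bs=1M count=1
    fi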
00:03:27.078 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:27.078 16:39:11 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:27.078 16:39:11 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:27.078 16:39:11 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:27.078 16:39:11 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:27.078 16:39:11 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:27.078 16:39:11 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:27.078 16:39:11 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:27.078 16:39:11 -- scripts/common.sh@335 -- # IFS=.-: 00:03:27.078 16:39:11 -- scripts/common.sh@335 -- # read -ra ver1 00:03:27.078 16:39:11 -- scripts/common.sh@336 -- # IFS=.-: 00:03:27.078 16:39:11 -- scripts/common.sh@336 -- # read -ra ver2 00:03:27.078 16:39:11 -- scripts/common.sh@337 -- # local 'op=<' 00:03:27.078 16:39:11 -- scripts/common.sh@339 -- # ver1_l=2 00:03:27.078 16:39:11 -- scripts/common.sh@340 -- # ver2_l=1 00:03:27.078 16:39:11 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:27.078 16:39:11 -- scripts/common.sh@343 -- # case "$op" in 00:03:27.078 16:39:11 -- scripts/common.sh@344 -- # : 1 00:03:27.078 16:39:11 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:27.078 16:39:11 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:27.078 16:39:11 -- scripts/common.sh@364 -- # decimal 1 00:03:27.078 16:39:11 -- scripts/common.sh@352 -- # local d=1 00:03:27.078 16:39:11 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:27.078 16:39:11 -- scripts/common.sh@354 -- # echo 1 00:03:27.078 16:39:11 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:27.078 16:39:11 -- scripts/common.sh@365 -- # decimal 2 00:03:27.078 16:39:11 -- scripts/common.sh@352 -- # local d=2 00:03:27.078 16:39:11 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:27.078 16:39:11 -- scripts/common.sh@354 -- # echo 2 00:03:27.078 16:39:11 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:27.078 16:39:11 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:27.078 16:39:11 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:27.078 16:39:11 -- scripts/common.sh@367 -- # return 0 00:03:27.078 16:39:11 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:27.078 16:39:11 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:27.078 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.078 --rc genhtml_branch_coverage=1 00:03:27.078 --rc genhtml_function_coverage=1 00:03:27.078 --rc genhtml_legend=1 00:03:27.078 --rc geninfo_all_blocks=1 00:03:27.078 --rc geninfo_unexecuted_blocks=1 00:03:27.078 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.078 ' 00:03:27.078 16:39:11 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:27.078 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.078 --rc genhtml_branch_coverage=1 00:03:27.078 --rc genhtml_function_coverage=1 00:03:27.078 --rc genhtml_legend=1 00:03:27.078 --rc geninfo_all_blocks=1 00:03:27.078 --rc geninfo_unexecuted_blocks=1 00:03:27.078 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.078 ' 00:03:27.078 16:39:11 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:27.078 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.078 --rc genhtml_branch_coverage=1 
00:03:27.078 --rc genhtml_function_coverage=1 00:03:27.078 --rc genhtml_legend=1 00:03:27.078 --rc geninfo_all_blocks=1 00:03:27.078 --rc geninfo_unexecuted_blocks=1 00:03:27.078 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.078 ' 00:03:27.078 16:39:11 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:27.078 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.078 --rc genhtml_branch_coverage=1 00:03:27.078 --rc genhtml_function_coverage=1 00:03:27.078 --rc genhtml_legend=1 00:03:27.078 --rc geninfo_all_blocks=1 00:03:27.078 --rc geninfo_unexecuted_blocks=1 00:03:27.078 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.078 ' 00:03:27.078 16:39:11 -- setup/test-setup.sh@10 -- # uname -s 00:03:27.078 16:39:11 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:27.078 16:39:11 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:27.078 16:39:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:27.078 16:39:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:27.078 16:39:11 -- common/autotest_common.sh@10 -- # set +x 00:03:27.078 ************************************ 00:03:27.078 START TEST acl 00:03:27.078 ************************************ 00:03:27.078 16:39:11 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:27.078 * Looking for test storage... 00:03:27.079 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:27.079 16:39:12 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:27.079 16:39:12 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:27.079 16:39:12 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:27.079 16:39:12 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:27.079 16:39:12 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:27.079 16:39:12 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:27.079 16:39:12 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:27.079 16:39:12 -- scripts/common.sh@335 -- # IFS=.-: 00:03:27.079 16:39:12 -- scripts/common.sh@335 -- # read -ra ver1 00:03:27.079 16:39:12 -- scripts/common.sh@336 -- # IFS=.-: 00:03:27.079 16:39:12 -- scripts/common.sh@336 -- # read -ra ver2 00:03:27.079 16:39:12 -- scripts/common.sh@337 -- # local 'op=<' 00:03:27.079 16:39:12 -- scripts/common.sh@339 -- # ver1_l=2 00:03:27.079 16:39:12 -- scripts/common.sh@340 -- # ver2_l=1 00:03:27.079 16:39:12 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:27.079 16:39:12 -- scripts/common.sh@343 -- # case "$op" in 00:03:27.079 16:39:12 -- scripts/common.sh@344 -- # : 1 00:03:27.079 16:39:12 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:27.079 16:39:12 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:27.079 16:39:12 -- scripts/common.sh@364 -- # decimal 1 00:03:27.079 16:39:12 -- scripts/common.sh@352 -- # local d=1 00:03:27.079 16:39:12 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:27.079 16:39:12 -- scripts/common.sh@354 -- # echo 1 00:03:27.079 16:39:12 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:27.079 16:39:12 -- scripts/common.sh@365 -- # decimal 2 00:03:27.079 16:39:12 -- scripts/common.sh@352 -- # local d=2 00:03:27.079 16:39:12 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:27.079 16:39:12 -- scripts/common.sh@354 -- # echo 2 00:03:27.079 16:39:12 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:27.079 16:39:12 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:27.079 16:39:12 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:27.079 16:39:12 -- scripts/common.sh@367 -- # return 0 00:03:27.079 16:39:12 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:27.079 16:39:12 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:27.079 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.079 --rc genhtml_branch_coverage=1 00:03:27.079 --rc genhtml_function_coverage=1 00:03:27.079 --rc genhtml_legend=1 00:03:27.079 --rc geninfo_all_blocks=1 00:03:27.079 --rc geninfo_unexecuted_blocks=1 00:03:27.079 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.079 ' 00:03:27.079 16:39:12 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:27.079 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.079 --rc genhtml_branch_coverage=1 00:03:27.079 --rc genhtml_function_coverage=1 00:03:27.079 --rc genhtml_legend=1 00:03:27.079 --rc geninfo_all_blocks=1 00:03:27.079 --rc geninfo_unexecuted_blocks=1 00:03:27.079 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.079 ' 00:03:27.079 16:39:12 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:27.079 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.079 --rc genhtml_branch_coverage=1 00:03:27.079 --rc genhtml_function_coverage=1 00:03:27.079 --rc genhtml_legend=1 00:03:27.079 --rc geninfo_all_blocks=1 00:03:27.079 --rc geninfo_unexecuted_blocks=1 00:03:27.079 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.079 ' 00:03:27.079 16:39:12 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:27.079 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:27.079 --rc genhtml_branch_coverage=1 00:03:27.079 --rc genhtml_function_coverage=1 00:03:27.079 --rc genhtml_legend=1 00:03:27.079 --rc geninfo_all_blocks=1 00:03:27.079 --rc geninfo_unexecuted_blocks=1 00:03:27.079 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:27.079 ' 00:03:27.079 16:39:12 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:27.079 16:39:12 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:27.079 16:39:12 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:27.079 16:39:12 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:27.079 16:39:12 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:27.079 16:39:12 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:27.079 16:39:12 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:27.079 16:39:12 -- common/autotest_common.sh@1659 -- # [[ -e 
/sys/block/nvme0n1/queue/zoned ]] 00:03:27.079 16:39:12 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:27.079 16:39:12 -- setup/acl.sh@12 -- # devs=() 00:03:27.079 16:39:12 -- setup/acl.sh@12 -- # declare -a devs 00:03:27.079 16:39:12 -- setup/acl.sh@13 -- # drivers=() 00:03:27.079 16:39:12 -- setup/acl.sh@13 -- # declare -A drivers 00:03:27.079 16:39:12 -- setup/acl.sh@51 -- # setup reset 00:03:27.079 16:39:12 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:27.079 16:39:12 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:30.373 16:39:16 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:30.373 16:39:16 -- setup/acl.sh@16 -- # local dev driver 00:03:30.373 16:39:16 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:30.373 16:39:16 -- setup/acl.sh@15 -- # setup output status 00:03:30.373 16:39:16 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:30.373 16:39:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:33.668 Hugepages 00:03:33.668 node hugesize free / total 00:03:33.668 16:39:19 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:33.668 16:39:19 -- setup/acl.sh@19 -- # continue 00:03:33.668 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.668 16:39:19 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:33.668 16:39:19 -- setup/acl.sh@19 -- # continue 00:03:33.668 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.668 16:39:19 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:33.668 16:39:19 -- setup/acl.sh@19 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 00:03:33.669 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 
16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.669 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:33.669 16:39:19 -- setup/acl.sh@20 -- # continue 00:03:33.669 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.929 16:39:19 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:33.929 16:39:19 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:33.929 16:39:19 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:33.929 16:39:19 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:33.929 16:39:19 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:33.929 16:39:19 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:33.929 16:39:19 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:33.929 16:39:19 -- setup/acl.sh@54 -- # run_test denied denied 00:03:33.929 16:39:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:33.929 16:39:19 -- common/autotest_common.sh@1093 -- # 
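The acl.sh trace above is a parse loop over `setup.sh status`: hugepage rows fail the `*:*:*.*` BDF glob and hit the first `continue`, ioatdma rows are skipped by the driver check, and each surviving NVMe BDF is appended to `devs` with its driver recorded. A condensed sketch of that loop; the `read` field order (type, BDF, vendor, device, NUMA, driver) is taken from the table header in the log, and the setup.sh path is relative to the SPDK repo root:

    declare -a devs
    declare -A drivers

    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue    # not a PCI BDF row (hugepage lines, headers)
        [[ $driver == nvme ]] || continue    # ioatdma and friends are not under test
        devs+=("$dev")
        drivers["$dev"]=$driver
    done < <(./scripts/setup.sh status)

    echo "found ${#devs[@]} NVMe controller(s): ${devs[*]}"

Keeping the loop body in the current shell via process substitution (rather than piping into `while`) is what lets the `devs` and `drivers` arrays survive for the denied/allowed tests that follow.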
xtrace_disable
00:03:33.929 16:39:19 -- common/autotest_common.sh@10 -- # set +x
00:03:33.929 ************************************
00:03:33.929 START TEST denied
00:03:33.929 ************************************
00:03:33.929 16:39:19 -- common/autotest_common.sh@1114 -- # denied
00:03:33.929 16:39:19 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0'
00:03:33.929 16:39:19 -- setup/acl.sh@38 -- # setup output config
00:03:33.929 16:39:19 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0'
00:03:33.929 16:39:19 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:33.929 16:39:19 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:03:38.129 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0
00:03:38.130 16:39:23 -- setup/acl.sh@40 -- # verify 0000:d8:00.0
00:03:38.130 16:39:23 -- setup/acl.sh@28 -- # local dev driver
00:03:38.130 16:39:23 -- setup/acl.sh@30 -- # for dev in "$@"
00:03:38.130 16:39:23 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]]
00:03:38.130 16:39:23 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver
00:03:38.130 16:39:23 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme
00:03:38.130 16:39:23 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]]
00:03:38.130 16:39:23 -- setup/acl.sh@41 -- # setup reset
00:03:38.130 16:39:23 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:38.130 16:39:23 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:03:43.412
00:03:43.412 real 0m8.615s
00:03:43.412 user 0m2.781s
00:03:43.412 sys 0m5.153s
00:03:43.412 16:39:28 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:43.412 16:39:28 -- common/autotest_common.sh@10 -- # set +x
00:03:43.412 ************************************
00:03:43.412 END TEST denied
00:03:43.412 ************************************
00:03:43.412 16:39:28 -- setup/acl.sh@55 -- # run_test allowed allowed
00:03:43.412 16:39:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:43.412 16:39:28 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:43.412 16:39:28 -- common/autotest_common.sh@10 -- # set +x
00:03:43.412 ************************************
00:03:43.412 START TEST allowed
00:03:43.412 ************************************
00:03:43.412 16:39:28 -- common/autotest_common.sh@1114 -- # allowed
00:03:43.412 16:39:28 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0
00:03:43.412 16:39:28 -- setup/acl.sh@45 -- # setup output config
00:03:43.412 16:39:28 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*'
00:03:43.412 16:39:28 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:43.412 16:39:28 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:03:47.610 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci
00:03:47.610 16:39:33 -- setup/acl.sh@47 -- # verify
00:03:47.610 16:39:33 -- setup/acl.sh@28 -- # local dev driver
00:03:47.610 16:39:33 -- setup/acl.sh@48 -- # setup reset
00:03:47.610 16:39:33 -- setup/common.sh@9 -- # [[ reset == output ]]
00:03:47.610 16:39:33 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:03:51.810
00:03:51.810 real 0m9.235s
00:03:51.810 user 0m2.677s
00:03:51.810 sys 0m5.153s
00:03:51.810 16:39:37 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:51.810 16:39:37 -- common/autotest_common.sh@10 -- # set +x
00:03:51.810 ************************************
00:03:51.810 END TEST allowed
00:03:51.810 ************************************
00:03:51.810
00:03:51.810 real 0m25.564s
00:03:51.810 user 0m8.347s
00:03:51.810 sys 0m15.444s
00:03:51.810 16:39:37 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:51.810 16:39:37 -- common/autotest_common.sh@10 -- # set +x
00:03:51.810 ************************************
00:03:51.810 END TEST acl
00:03:51.810 ************************************
00:03:51.810 16:39:37 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:03:51.810 16:39:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:51.810 16:39:37 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:51.810 16:39:37 -- common/autotest_common.sh@10 -- # set +x
00:03:51.810 ************************************
00:03:51.810 START TEST hugepages
00:03:51.810 ************************************
00:03:51.810 16:39:37 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh
00:03:52.071 * Looking for test storage...
00:03:52.071 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:03:52.071 16:39:37 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:03:52.071 16:39:37 -- common/autotest_common.sh@1690 -- # lcov --version
00:03:52.071 16:39:37 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:03:52.071 16:39:37 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:03:52.071 16:39:37 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:03:52.071 16:39:37 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:03:52.071 16:39:37 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:03:52.071 16:39:37 -- scripts/common.sh@335 -- # IFS=.-:
00:03:52.071 16:39:37 -- scripts/common.sh@335 -- # read -ra ver1
00:03:52.071 16:39:37 -- scripts/common.sh@336 -- # IFS=.-:
00:03:52.071 16:39:37 -- scripts/common.sh@336 -- # read -ra ver2
00:03:52.071 16:39:37 -- scripts/common.sh@337 -- # local 'op=<'
00:03:52.071 16:39:37 -- scripts/common.sh@339 -- # ver1_l=2
00:03:52.071 16:39:37 -- scripts/common.sh@340 -- # ver2_l=1
00:03:52.071 16:39:37 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:03:52.071 16:39:37 -- scripts/common.sh@343 -- # case "$op" in
00:03:52.071 16:39:37 -- scripts/common.sh@344 -- # : 1
00:03:52.071 16:39:37 -- scripts/common.sh@363 -- # (( v = 0 ))
00:03:52.071 16:39:37 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:03:52.071 16:39:37 -- scripts/common.sh@364 -- # decimal 1
00:03:52.071 16:39:37 -- scripts/common.sh@352 -- # local d=1
00:03:52.071 16:39:37 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:03:52.071 16:39:37 -- scripts/common.sh@354 -- # echo 1
00:03:52.071 16:39:37 -- scripts/common.sh@364 -- # ver1[v]=1
00:03:52.071 16:39:37 -- scripts/common.sh@365 -- # decimal 2
00:03:52.071 16:39:37 -- scripts/common.sh@352 -- # local d=2
00:03:52.071 16:39:37 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:03:52.071 16:39:37 -- scripts/common.sh@354 -- # echo 2
00:03:52.071 16:39:37 -- scripts/common.sh@365 -- # ver2[v]=2
00:03:52.071 16:39:37 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:03:52.071 16:39:37 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:03:52.071 16:39:37 -- scripts/common.sh@367 -- # return 0
00:03:52.071 16:39:37 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:03:52.071 16:39:37 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:03:52.071 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:03:52.071 --rc genhtml_branch_coverage=1
00:03:52.071 --rc genhtml_function_coverage=1
00:03:52.071 --rc genhtml_legend=1
00:03:52.072 --rc geninfo_all_blocks=1
00:03:52.072 --rc geninfo_unexecuted_blocks=1
00:03:52.072 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:03:52.072 '
00:03:52.072 16:39:37 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:03:52.072 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:03:52.072 --rc genhtml_branch_coverage=1
00:03:52.072 --rc genhtml_function_coverage=1
00:03:52.072 --rc genhtml_legend=1
00:03:52.072 --rc geninfo_all_blocks=1
00:03:52.072 --rc geninfo_unexecuted_blocks=1
00:03:52.072 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:03:52.072 '
00:03:52.072 16:39:37 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:03:52.072 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:03:52.072 --rc genhtml_branch_coverage=1
00:03:52.072 --rc genhtml_function_coverage=1
00:03:52.072 --rc genhtml_legend=1
00:03:52.072 --rc geninfo_all_blocks=1
00:03:52.072 --rc geninfo_unexecuted_blocks=1
00:03:52.072 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:03:52.072 '
00:03:52.072 16:39:37 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:03:52.072 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:03:52.072 --rc genhtml_branch_coverage=1
00:03:52.072 --rc genhtml_function_coverage=1
00:03:52.072 --rc genhtml_legend=1
00:03:52.072 --rc geninfo_all_blocks=1
00:03:52.072 --rc geninfo_unexecuted_blocks=1
00:03:52.072 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:03:52.072 '
00:03:52.072 16:39:37 -- setup/hugepages.sh@10 -- # nodes_sys=()
00:03:52.072 16:39:37 -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:03:52.072 16:39:37 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:03:52.072 16:39:37 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:03:52.072 16:39:37 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:03:52.072 16:39:37 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:03:52.072 16:39:37 -- setup/common.sh@17 -- # local get=Hugepagesize
00:03:52.072 16:39:37 -- setup/common.sh@18 -- # local node=
00:03:52.072 16:39:37 -- setup/common.sh@19 -- # local var val
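A note on the cmp_versions walk just traced: lt 1.15 2 splits both version strings on '.', '-' and ':' and compares them field by field as integers, so 1.15 sorts below 2 even though a plain string compare would not. A condensed re-creation of that logic (the name version_lt is mine, and the sketch assumes numeric fields, which the traced decimal helper enforces):

    version_lt() {
        local -a ver1 ver2
        local v
        IFS='.-:' read -ra ver1 <<< "$1"
        IFS='.-:' read -ra ver2 <<< "$2"
        # walk the longer of the two field lists, padding the shorter with 0
        for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
            ((10#${ver1[v]:-0} < 10#${ver2[v]:-0})) && return 0
            ((10#${ver1[v]:-0} > 10#${ver2[v]:-0})) && return 1
        done
        return 1   # equal is not less-than
    }

    version_lt "$(lcov --version | awk '{print $NF}')" 2 && echo 'lcov 1.x'

Here that test succeeded (return 0 above), which is why the LCOV_OPTS/LCOV exports picked up the --rc lcov_branch_coverage=1 and --rc lcov_function_coverage=1 flags.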
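Stepping back to the acl suite that closed above: the denied/allowed pair drives scripts/setup.sh through its two device filters. PCI_BLOCKED names BDFs setup.sh must leave alone (hence "Skipping denied controller at 0000:d8:00.0"), while PCI_ALLOWED restricts it to the listed BDFs (hence the nvme -> vfio-pci rebind of the same controller). A minimal sketch of how such space-separated filters behave; pci_can_use and its exact matching rules are illustrative assumptions, not code lifted from setup.sh:

    pci_can_use() {
        # hypothetical helper: may setup touch this BDF?
        local bdf=$1
        # anything on the block list is always skipped
        [[ " ${PCI_BLOCKED:-} " == *" $bdf "* ]] && return 1
        # an empty allow list means every remaining device is fair game
        [[ -z ${PCI_ALLOWED:-} ]] && return 0
        [[ " $PCI_ALLOWED " == *" $bdf "* ]]
    }

    PCI_BLOCKED=' 0000:d8:00.0'
    pci_can_use 0000:d8:00.0 || echo 'Skipping denied controller at 0000:d8:00.0'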
00:03:52.072 16:39:37 -- setup/common.sh@20 -- # local mem_f mem 00:03:52.072 16:39:37 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:52.072 16:39:37 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:52.072 16:39:37 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:52.072 16:39:37 -- setup/common.sh@28 -- # mapfile -t mem 00:03:52.072 16:39:37 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 39383276 kB' 'MemAvailable: 43109296 kB' 'Buffers: 8940 kB' 'Cached: 12523572 kB' 'SwapCached: 0 kB' 'Active: 9392724 kB' 'Inactive: 3688316 kB' 'Active(anon): 8976228 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 551968 kB' 'Mapped: 149240 kB' 'Shmem: 8427700 kB' 'KReclaimable: 236148 kB' 'Slab: 905112 kB' 'SReclaimable: 236148 kB' 'SUnreclaim: 668964 kB' 'KernelStack: 21760 kB' 'PageTables: 7692 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433336 kB' 'Committed_AS: 10238116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214000 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB' 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 
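Pausing the scan above: get_meminfo slurps /proc/meminfo (or a per-node meminfo when a node id is passed), strips any "Node N " prefix, then walks key/value pairs with IFS=': ' until the requested field matches, which is exactly what the per-key [[ ... ]] / continue lines are tracing. A condensed re-creation of that helper, not the verbatim setup/common.sh source:

    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # a node id switches the source to that node's meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        shopt -s extglob
        mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix every line with "Node N "
        local line var val _
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    get_meminfo Hugepagesize   # prints 2048 on this runner

With Hugepagesize at 2048 kB, hugepages.sh sets default_hugepages=2048 and sizes its pools in 2 MiB units from here on.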
00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.072 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.072 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- 
# IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- 
setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # continue 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # IFS=': ' 00:03:52.073 16:39:37 -- setup/common.sh@31 -- # read -r var val _ 00:03:52.073 16:39:37 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:52.073 16:39:37 -- setup/common.sh@33 -- # echo 2048 00:03:52.073 16:39:37 -- setup/common.sh@33 -- # return 0 00:03:52.073 16:39:37 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:52.073 16:39:37 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:52.073 16:39:37 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:52.073 16:39:37 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:52.073 16:39:37 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:52.073 16:39:37 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:52.073 16:39:37 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:52.073 16:39:37 -- setup/hugepages.sh@207 -- # get_nodes 00:03:52.073 16:39:37 -- setup/hugepages.sh@27 -- # local node 00:03:52.073 16:39:37 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:52.073 16:39:37 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:52.073 16:39:37 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:52.073 16:39:37 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:52.073 16:39:37 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:52.073 16:39:37 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:52.073 16:39:37 -- 
setup/hugepages.sh@208 -- # clear_hp 00:03:52.073 16:39:37 -- setup/hugepages.sh@37 -- # local node hp 00:03:52.073 16:39:37 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:52.073 16:39:37 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:52.073 16:39:37 -- setup/hugepages.sh@41 -- # echo 0 00:03:52.073 16:39:37 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:52.073 16:39:37 -- setup/hugepages.sh@41 -- # echo 0 00:03:52.073 16:39:37 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:52.073 16:39:37 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:52.073 16:39:37 -- setup/hugepages.sh@41 -- # echo 0 00:03:52.073 16:39:37 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:52.073 16:39:37 -- setup/hugepages.sh@41 -- # echo 0 00:03:52.073 16:39:37 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:52.073 16:39:37 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:52.073 16:39:37 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:52.073 16:39:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:52.073 16:39:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:52.073 16:39:37 -- common/autotest_common.sh@10 -- # set +x 00:03:52.073 ************************************ 00:03:52.073 START TEST default_setup 00:03:52.073 ************************************ 00:03:52.073 16:39:37 -- common/autotest_common.sh@1114 -- # default_setup 00:03:52.073 16:39:37 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:52.073 16:39:37 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:52.073 16:39:37 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:52.073 16:39:37 -- setup/hugepages.sh@51 -- # shift 00:03:52.073 16:39:37 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:52.073 16:39:37 -- setup/hugepages.sh@52 -- # local node_ids 00:03:52.073 16:39:37 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:52.073 16:39:37 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:52.073 16:39:37 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:52.073 16:39:37 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:52.073 16:39:37 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:52.073 16:39:37 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:52.073 16:39:37 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:52.074 16:39:37 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:52.074 16:39:37 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:52.074 16:39:37 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:52.074 16:39:37 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:52.074 16:39:37 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:52.074 16:39:37 -- setup/hugepages.sh@73 -- # return 0 00:03:52.074 16:39:37 -- setup/hugepages.sh@137 -- # setup output 00:03:52.074 16:39:37 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:52.074 16:39:37 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:56.271 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:00:04.3 (8086 2021): 
ioatdma -> vfio-pci 00:03:56.271 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:56.271 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:57.210 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:57.473 16:39:43 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:57.473 16:39:43 -- setup/hugepages.sh@89 -- # local node 00:03:57.473 16:39:43 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:57.473 16:39:43 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:57.473 16:39:43 -- setup/hugepages.sh@92 -- # local surp 00:03:57.473 16:39:43 -- setup/hugepages.sh@93 -- # local resv 00:03:57.473 16:39:43 -- setup/hugepages.sh@94 -- # local anon 00:03:57.473 16:39:43 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:57.473 16:39:43 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:57.473 16:39:43 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:57.473 16:39:43 -- setup/common.sh@18 -- # local node= 00:03:57.473 16:39:43 -- setup/common.sh@19 -- # local var val 00:03:57.473 16:39:43 -- setup/common.sh@20 -- # local mem_f mem 00:03:57.473 16:39:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.473 16:39:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.473 16:39:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.473 16:39:43 -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.473 16:39:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41578984 kB' 'MemAvailable: 45304456 kB' 'Buffers: 8940 kB' 'Cached: 12523720 kB' 'SwapCached: 0 kB' 'Active: 9394088 kB' 'Inactive: 3688316 kB' 'Active(anon): 8977592 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552532 kB' 'Mapped: 149264 kB' 'Shmem: 8427848 kB' 'KReclaimable: 235052 kB' 'Slab: 903420 kB' 'SReclaimable: 235052 kB' 'SUnreclaim: 668368 kB' 'KernelStack: 21760 kB' 'PageTables: 7416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10241612 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214208 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB' 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
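One detail in verify_nr_hugepages just above: before counting anonymous huge pages it tests the transparent-hugepage mode string, "always [madvise] never", where the kernel brackets the active mode. Only when the active mode is not [never] is AnonHugePages worth sampling. A sketch of that gate; the sysfs path below is the standard location for this string, an assumption here since the trace shows only the string test itself:

    thp_mode_active() {
        local modes
        modes=$(</sys/kernel/mm/transparent_hugepage/enabled)
        # e.g. "always [madvise] never" -> active mode is madvise
        [[ $modes != *'[never]'* ]]
    }

    if thp_mode_active; then
        anon=$(get_meminfo AnonHugePages)   # reuses the get_meminfo sketch above
    else
        anon=0
    fi

Here the string is "always [madvise] never", so the test passes and the AnonHugePages scan that follows runs (it reports 0 kB).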
00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- 
setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.473 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.473 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.474 16:39:43 -- setup/common.sh@33 -- # echo 0 00:03:57.474 16:39:43 -- setup/common.sh@33 -- # return 0 00:03:57.474 16:39:43 -- setup/hugepages.sh@97 -- # anon=0 00:03:57.474 16:39:43 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:57.474 16:39:43 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:57.474 16:39:43 -- setup/common.sh@18 -- # local node= 00:03:57.474 16:39:43 -- setup/common.sh@19 -- # local var val 00:03:57.474 16:39:43 -- setup/common.sh@20 -- # local mem_f mem 00:03:57.474 16:39:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.474 16:39:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.474 16:39:43 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.474 16:39:43 -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.474 16:39:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41580464 kB' 'MemAvailable: 45305936 kB' 'Buffers: 8940 kB' 'Cached: 12523724 kB' 'SwapCached: 0 kB' 'Active: 9393928 kB' 'Inactive: 3688316 kB' 'Active(anon): 8977432 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552388 kB' 'Mapped: 149296 kB' 'Shmem: 8427852 kB' 'KReclaimable: 235052 kB' 'Slab: 903500 kB' 'SReclaimable: 235052 kB' 'SUnreclaim: 668448 kB' 'KernelStack: 21792 kB' 'PageTables: 7476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10241624 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB' 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.474 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.474 16:39:43 -- setup/common.sh@32 -- # continue 
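The HugePages_Surp scan in progress here completes the pool arithmetic for default_setup: the test asked get_test_nr_hugepages for 2097152 kB on node 0, and at 2048 kB per page that is 2097152 / 2048 = 1024 pages, matching the HugePages_Total: 1024 and HugePages_Free: 1024 in the snapshots above. Surplus and reserved counts are then read the same way. A sketch of the bookkeeping; the derived quantities and their names are mine, not hugepages.sh's:

    # expected pool for a 2 GiB request at 2 MiB pages
    nr_hugepages=$((2097152 / 2048))        # = 1024

    total=$(get_meminfo HugePages_Total)    # 1024
    free=$(get_meminfo HugePages_Free)      # 1024
    surp=$(get_meminfo HugePages_Surp)      # 0
    resv=$(get_meminfo HugePages_Rsvd)      # 0

    # persistent pages exclude surplus; nothing is mapped yet
    echo "persistent: $((total - surp)) in-use: $((total - free)) reserved: $resv"

A healthy default_setup run therefore shows persistent == nr_hugepages with zero pages in use, which is what the values above bear out.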
00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.475 16:39:43 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.475 16:39:43 -- setup/common.sh@32 -- # [[ SwapFree 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:57.475 16:39:43 -- setup/common.sh@32 -- # continue
00:03:57.475 16:39:43 -- setup/common.sh@31-32 -- # skim loop over the remaining /proc/meminfo fields (Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd): none matches HugePages_Surp, each iteration continues
00:03:57.476 16:39:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:57.476 16:39:43 -- setup/common.sh@33 -- # echo 0
00:03:57.476 16:39:43 -- setup/common.sh@33 -- # return 0
00:03:57.476 16:39:43 -- setup/hugepages.sh@99 -- # surp=0
00:03:57.476 16:39:43 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:57.476 16:39:43 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:57.476 16:39:43 -- setup/common.sh@18 -- # local node=
00:03:57.476 16:39:43 -- setup/common.sh@19 -- # local var val
00:03:57.476 16:39:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:57.476 16:39:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:57.476 16:39:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:57.476 16:39:43 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:57.476 16:39:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:57.476 16:39:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:57.476 16:39:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41579760 kB' 'MemAvailable: 45305232 kB' 'Buffers: 8940 kB' 'Cached: 12523736 kB' 'SwapCached: 0 kB' 'Active: 9393392 kB' 'Inactive: 3688316 kB' 'Active(anon): 8976896 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552260 kB' 'Mapped: 149216 kB' 'Shmem: 8427864 kB' 'KReclaimable: 235052 kB' 'Slab: 903476 kB' 'SReclaimable: 235052 kB' 'SUnreclaim: 668424 kB' 'KernelStack: 21856 kB' 'PageTables: 7580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10241640 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
00:03:57.477 16:39:43 -- setup/common.sh@31-32 -- # skim loop: every field from MemTotal through HugePages_Free fails the HugePages_Rsvd test and continues
00:03:57.478 16:39:43 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:57.478 16:39:43 -- setup/common.sh@33 -- # echo 0
00:03:57.478 16:39:43 -- setup/common.sh@33 -- # return 0
00:03:57.478 16:39:43 -- setup/hugepages.sh@100 -- # resv=0
00:03:57.478 16:39:43 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:57.478 nr_hugepages=1024
00:03:57.478 16:39:43 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:57.478 resv_hugepages=0
00:03:57.478 16:39:43 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:57.478 surplus_hugepages=0
00:03:57.478 16:39:43 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:57.478 anon_hugepages=0
00:03:57.478 16:39:43 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:57.478 16:39:43 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:57.478 16:39:43 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:57.478 16:39:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41578880 kB' 'MemAvailable: 45304352 kB' 'Buffers: 8940 kB' 'Cached: 12523748 kB' 'SwapCached: 0 kB' 'Active: 9393476 kB' 'Inactive: 3688316 kB' 'Active(anon): 8976980 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552324 kB' 'Mapped: 149216 kB' 'Shmem: 8427876 kB' 'KReclaimable: 235052 kB' 'Slab: 903476 kB' 'SReclaimable: 235052 kB' 'SUnreclaim: 668424 kB' 'KernelStack: 21952 kB' 'PageTables: 7632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10241652 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
00:03:57.479 16:39:43 -- setup/common.sh@31-32 -- # skim loop: every field from MemTotal through Unaccepted fails the HugePages_Total test and continues
00:03:57.480 16:39:43 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:57.480 16:39:43 -- setup/common.sh@33 -- # echo 1024
00:03:57.480 16:39:43 -- setup/common.sh@33 -- # return 0
00:03:57.480 16:39:43 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:57.480 16:39:43 -- setup/hugepages.sh@112 -- # get_nodes
00:03:57.480 16:39:43 -- setup/hugepages.sh@27 -- # local node
00:03:57.480 16:39:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:57.480 16:39:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:57.480 16:39:43 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:57.480 16:39:43 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:57.480 16:39:43 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:57.480 16:39:43 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:57.480 16:39:43 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:57.480 16:39:43 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:57.480 16:39:43 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:57.480 16:39:43 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:57.480 16:39:43 -- setup/common.sh@18 -- # local node=0
00:03:57.480 16:39:43 -- setup/common.sh@19 -- # local var val
00:03:57.480 16:39:43 -- setup/common.sh@20 -- # local mem_f mem
00:03:57.480 16:39:43 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:57.480 16:39:43 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:57.480 16:39:43 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:57.480 16:39:43 -- setup/common.sh@28 -- # mapfile -t mem
00:03:57.480 16:39:43 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:57.480 16:39:43 -- setup/common.sh@31 -- # IFS=': '
00:03:57.480 16:39:43 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19341488 kB' 'MemUsed: 13243880 kB' 'SwapCached: 0 kB' 'Active: 6480224 kB' 'Inactive: 3554120 kB' 'Active(anon): 6187164 kB' 'Inactive(anon): 0 kB' 'Active(file): 293060 kB' 'Inactive(file): 3554120 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9625492 kB' 'Mapped: 127876 kB' 'AnonPages: 412020 kB' 'Shmem: 5778312 kB' 'KernelStack: 13096 kB' 'PageTables: 5060 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149952 kB' 'Slab: 460500 kB' 'SReclaimable: 149952 kB' 'SUnreclaim: 310548 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
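
This second get_meminfo call shows the node-aware branch: with a node id given, common.sh@24 swaps mem_f over to /sys/devices/system/node/node0/meminfo, and the @29 expansion strips the "Node 0 " prefix each line carries there. A sketch of that branch, assuming bash 4+ for mapfile; variable names beyond those in the trace are mine:

#!/usr/bin/env bash
# Per-node variant of the meminfo lookup: same skim loop, different file,
# plus an extglob strip of the "Node N " prefix on the sysfs lines.
shopt -s extglob                       # needed for the +([0-9]) pattern

get_meminfo() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo mem line var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 MemTotal: ..." -> "MemTotal: ..."
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

surp0=$(get_meminfo HugePages_Surp 0)  # node 0, as at hugepages.sh@117
echo "node0 surp=$surp0"

Running the prefix strip unconditionally is harmless for /proc/meminfo, whose lines never start with "Node", so one code path serves both files.
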
00:03:57.480 16:39:43 -- setup/common.sh@31-32 -- # skim loop over the node0 fields (MemTotal through HugePages_Free): none matches HugePages_Surp, each iteration continues
00:03:57.481 16:39:43 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:57.481 16:39:43 -- setup/common.sh@33 -- # echo 0
00:03:57.481 16:39:43 -- setup/common.sh@33 -- # return 0
00:03:57.481 16:39:43 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:57.481 16:39:43 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:57.481 16:39:43 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:57.481 16:39:43 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
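
The sorted_t/sorted_s assignments at hugepages.sh@127 use a compact bash idiom: indexing an array by the value itself collapses duplicates, so the number of set indices afterwards is the number of distinct per-node counts. A sketch of just that idiom; the array names follow the trace, the demo data is made up:

#!/usr/bin/env bash
# Using a count as an array *index* turns the array into a set of
# distinct values; ${#arr[@]} then counts how many distinct counts exist.
declare -a nodes_test=(1024 1024)    # per-node counts gathered above
declare -a sorted_t=()

for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1     # index by the count itself
done

# One distinct value means every node agrees on its hugepage count.
echo "distinct counts: ${#sorted_t[@]}"   # -> 1
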
00:03:57.481 16:39:43 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:57.481 node0=1024 expecting 1024
00:03:57.481 16:39:43 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:57.481
00:03:57.481 real	0m5.400s
00:03:57.481 user	0m1.511s
00:03:57.481 sys	0m2.465s
00:03:57.481 16:39:43 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:57.481 16:39:43 -- common/autotest_common.sh@10 -- # set +x
00:03:57.481 ************************************
00:03:57.481 END TEST default_setup
00:03:57.481 ************************************
00:03:57.741 16:39:43 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:57.741 16:39:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:57.741 16:39:43 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:57.741 16:39:43 -- common/autotest_common.sh@10 -- # set +x
00:03:57.741 ************************************
00:03:57.741 START TEST per_node_1G_alloc
00:03:57.741 ************************************
00:03:57.741 16:39:43 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:03:57.741 16:39:43 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:57.741 16:39:43 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:57.741 16:39:43 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:57.741 16:39:43 -- setup/hugepages.sh@50-73 -- # two node ids given, so the 1048576 kB request becomes 512 default-size pages per node: node_ids=('0' '1'), nr_hugepages=512, user_nodes=('0' '1'), _nr_hugepages=512, _no_nodes=2, nodes_test[0]=512, nodes_test[1]=512, return 0
00:03:57.741 16:39:43 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:57.741 16:39:43 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:57.741 16:39:43 -- setup/hugepages.sh@146 -- # setup output
00:03:57.741 16:39:43 -- setup/common.sh@9 -- # [[ output == output ]]
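
Before the setup.sh run below, the arithmetic behind get_test_nr_hugepages 1048576 0 1 is worth spelling out: a 1048576 kB (1 GiB) request divided by the default 2048 kB hugepage size gives the NRHUGE=512 pages assigned to each node in HUGENODE. A sketch of that calculation under the same assumptions; helper and variable names follow the trace, the standalone framing is mine:

#!/usr/bin/env bash
# Derive the per-node page count the way the trace does: size in kB
# divided by the kernel's default Hugepagesize, applied to each node id.
size_kb=1048576                      # requested allocation per node, in kB
node_ids=(0 1)                       # HUGENODE=0,1 in the trace

default_kb=$(awk '$1 == "Hugepagesize:" {print $2}' /proc/meminfo)  # 2048 here
nr_hugepages=$(( size_kb / default_kb ))                            # -> 512

declare -a nodes_test=()
for node in "${node_ids[@]}"; do
    nodes_test[node]=$nr_hugepages   # 512 pages on node 0 and node 1
done

echo "NRHUGE=$nr_hugepages HUGENODE=$(IFS=,; echo "${node_ids[*]}")"

setup.sh then reads NRHUGE and HUGENODE from the environment when it reserves the pages.
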
00:03:57.741 16:39:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:01.034 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:01.034 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:01.298 16:39:46 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:04:01.298 16:39:46 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:04:01.298 16:39:46 -- setup/hugepages.sh@89 -- # local node
00:04:01.298 16:39:46 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:01.298 16:39:46 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:01.298 16:39:46 -- setup/hugepages.sh@92 -- # local surp
00:04:01.298 16:39:46 -- setup/hugepages.sh@93 -- # local resv
00:04:01.298 16:39:46 -- setup/hugepages.sh@94 -- # local anon
00:04:01.298 16:39:46 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:01.298 16:39:46 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:01.298 16:39:46 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:01.298 16:39:46 -- setup/common.sh@18 -- # local node=
00:04:01.298 16:39:46 -- setup/common.sh@19 -- # local var val
00:04:01.298 16:39:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:01.298 16:39:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:01.298 16:39:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:01.298 16:39:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:01.298 16:39:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:01.298 16:39:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:01.298 16:39:46 -- setup/common.sh@31 -- # IFS=': '
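
The hugepages.sh@96 test above is the transparent-hugepage gate: verify_nr_hugepages only bothers with AnonHugePages when the bracketed mode in sysfs is not [never] (here it is "always [madvise] never", i.e. madvise mode). A sketch of that gate, assuming the standard THP sysfs path; the surrounding framing is mine:

#!/usr/bin/env bash
# Only read AnonHugePages when transparent hugepages are not disabled,
# mirroring the *\[\n\e\v\e\r\]* pattern test in the trace.
thp=/sys/kernel/mm/transparent_hugepage/enabled
anon_hugepages=0

if [[ -r $thp && $(<"$thp") != *"[never]"* ]]; then
    # e.g. "always [madvise] never" on this build host
    anon_hugepages=$(awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo)
fi
echo "anon_hugepages=$anon_hugepages"
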
00:04:01.298 16:39:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41610068 kB' 'MemAvailable: 45335540 kB' 'Buffers: 8940 kB' 'Cached: 12523836 kB' 'SwapCached: 0 kB' 'Active: 9393720 kB' 'Inactive: 3688316 kB' 'Active(anon): 8977224 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552392 kB' 'Mapped: 148228 kB' 'Shmem: 8427964 kB' 'KReclaimable: 235052 kB' 'Slab: 903328 kB' 'SReclaimable: 235052 kB' 'SUnreclaim: 668276 kB' 'KernelStack: 21824 kB' 'PageTables: 7664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10230400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214416 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
00:04:01.298 16:39:46 -- setup/common.sh@31-32 -- # skim loop: every field from MemTotal through VmallocUsed fails the AnonHugePages test and continues
00:04:01.299 16:39:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.299 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.299 16:39:46 --
setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.299 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.299 16:39:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.299 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.299 16:39:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:01.299 16:39:46 -- setup/common.sh@33 -- # echo 0 00:04:01.299 16:39:46 -- setup/common.sh@33 -- # return 0 00:04:01.299 16:39:46 -- setup/hugepages.sh@97 -- # anon=0 00:04:01.299 16:39:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:01.299 16:39:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:01.299 16:39:46 -- setup/common.sh@18 -- # local node= 00:04:01.299 16:39:46 -- setup/common.sh@19 -- # local var val 00:04:01.299 16:39:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:01.299 16:39:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:01.299 16:39:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:01.299 16:39:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:01.299 16:39:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:01.299 16:39:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.299 16:39:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41611240 kB' 'MemAvailable: 45336712 kB' 'Buffers: 8940 kB' 'Cached: 12523836 kB' 'SwapCached: 0 kB' 'Active: 9393344 kB' 'Inactive: 3688316 kB' 'Active(anon): 8976848 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552124 kB' 'Mapped: 148212 kB' 'Shmem: 8427964 kB' 'KReclaimable: 235052 kB' 'Slab: 903372 kB' 'SReclaimable: 235052 kB' 'SUnreclaim: 668320 kB' 'KernelStack: 21792 kB' 'PageTables: 7584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10230412 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB' 00:04:01.299 16:39:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.299 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.299 16:39:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.299 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.299 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.299 
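The get_meminfo call traced above reads /proc/meminfo line by line and returns the value of the one requested key. A minimal sketch of that lookup, assuming the same file layouts; the function name and the sed/awk pipeline are illustrative, not SPDK's setup/common.sh verbatim:

    #!/usr/bin/env bash
    # Sketch: print the numeric value of one meminfo key; with a node id,
    # read that node's meminfo (whose lines carry a "Node N " prefix).
    get_meminfo_sketch() {
        local key=$1 node=$2
        local mem_f=/proc/meminfo
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        # Strip the per-node prefix, then print the value for the key.
        sed 's/^Node [0-9]* //' "$mem_f" | awk -v k="$key" -F': *' '$1 == k { print $2 + 0 }'
    }

    get_meminfo_sketch AnonHugePages      # 0 on this box, per the snapshot above
    get_meminfo_sketch HugePages_Total    # 1024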
00:04:01.299 16:39:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:01.299 16:39:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
[identical get_meminfo locals/mapfile setup elided]
00:04:01.299 16:39:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41611240 kB' 'MemAvailable: 45336712 kB' 'Buffers: 8940 kB' 'Cached: 12523836 kB' 'SwapCached: 0 kB' 'Active: 9393344 kB' 'Inactive: 3688316 kB' 'Active(anon): 8976848 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552124 kB' 'Mapped: 148212 kB' 'Shmem: 8427964 kB' 'KReclaimable: 235052 kB' 'Slab: 903372 kB' 'SReclaimable: 235052 kB' 'SUnreclaim: 668320 kB' 'KernelStack: 21792 kB' 'PageTables: 7584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10230412 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[xtrace elided: setup/common.sh@32 per-key scan, this time matching HugePages_Surp]
00:04:01.301 16:39:46 -- setup/common.sh@33 -- # echo 0
00:04:01.301 16:39:46 -- setup/common.sh@33 -- # return 0
00:04:01.301 16:39:46 -- setup/hugepages.sh@99 -- # surp=0
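Every traced command above carries a sourcefile@line -- # prefix alongside two timestamps. That is bash xtrace output with a customized PS4, which bash re-expands before printing each command; a sketch of the mechanism (an assumption about how such prefixes are typically produced, not SPDK's exact autotest configuration):

    # Assumed mechanism only: PS4 is expanded anew for every traced command,
    # so it can embed the wall clock, source file, and line number. The
    # leading character of PS4 is repeated by bash to show nesting depth.
    export PS4=' $(date +%T) -- ${BASH_SOURCE##*/}@${LINENO} -- # '
    set -x   # from here on, every command is echoed with that prefix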
00:04:01.301 16:39:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:01.301 16:39:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
[identical get_meminfo locals/mapfile setup elided]
00:04:01.301 16:39:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41611952 kB' 'MemAvailable: 45337424 kB' 'Buffers: 8940 kB' 'Cached: 12523848 kB' 'SwapCached: 0 kB' 'Active: 9393204 kB' 'Inactive: 3688316 kB' 'Active(anon): 8976708 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 551924 kB' 'Mapped: 148212 kB' 'Shmem: 8427976 kB' 'KReclaimable: 235052 kB' 'Slab: 903372 kB' 'SReclaimable: 235052 kB' 'SUnreclaim: 668320 kB' 'KernelStack: 21776 kB' 'PageTables: 7528 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10230424 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[xtrace elided: setup/common.sh@32 per-key scan, this time matching HugePages_Rsvd]
00:04:01.302 16:39:46 -- setup/common.sh@33 -- # echo 0
00:04:01.302 16:39:46 -- setup/common.sh@33 -- # return 0
00:04:01.302 16:39:46 -- setup/hugepages.sh@100 -- # resv=0
00:04:01.302 16:39:46 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:01.302 nr_hugepages=1024
00:04:01.302 16:39:46 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:01.302 resv_hugepages=0
00:04:01.302 16:39:46 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:01.302 surplus_hugepages=0
00:04:01.302 16:39:46 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:01.302 anon_hugepages=0
00:04:01.302 16:39:46 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:01.302 16:39:46 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
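With anon, surp, and resv all known, hugepages.sh@107 and @109 assert that the configured pool is exactly what the kernel reports. A sketch of that accounting as read from the trace, reusing the hypothetical get_meminfo_sketch from above:

    nr_hugepages=1024
    anon=$(get_meminfo_sketch AnonHugePages)     # transparent hugepage usage, kB
    surp=$(get_meminfo_sketch HugePages_Surp)    # pages allocated beyond the configured pool
    resv=$(get_meminfo_sketch HugePages_Rsvd)    # pages reserved but not yet faulted in
    total=$(get_meminfo_sketch HugePages_Total)
    # The pool verifies only if the kernel total covers the request plus
    # any surplus and outstanding reservations.
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2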
00:04:01.302 16:39:46 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:01.302 16:39:46 -- setup/common.sh@17 -- # local get=HugePages_Total
[identical get_meminfo locals/mapfile setup elided]
00:04:01.302 16:39:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41611700 kB' 'MemAvailable: 45337172 kB' 'Buffers: 8940 kB' 'Cached: 12523864 kB' 'SwapCached: 0 kB' 'Active: 9393380 kB' 'Inactive: 3688316 kB' 'Active(anon): 8976884 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552124 kB' 'Mapped: 148212 kB' 'Shmem: 8427992 kB' 'KReclaimable: 235052 kB' 'Slab: 903372 kB' 'SReclaimable: 235052 kB' 'SUnreclaim: 668320 kB' 'KernelStack: 21792 kB' 'PageTables: 7584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10230440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[xtrace elided: setup/common.sh@32 per-key scan, this time matching HugePages_Total]
00:04:01.304 16:39:46 -- setup/common.sh@33 -- # echo 1024
00:04:01.304 16:39:46 -- setup/common.sh@33 -- # return 0
00:04:01.304 16:39:46 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:01.304 16:39:46 -- setup/hugepages.sh@112 -- # get_nodes
00:04:01.304 16:39:46 -- setup/hugepages.sh@27 -- # local node
00:04:01.304 16:39:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:01.304 16:39:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:01.304 16:39:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:01.304 16:39:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:01.304 16:39:46 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:01.304 16:39:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:01.304 16:39:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:01.304 16:39:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:01.304 16:39:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:01.304 16:39:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:01.304 16:39:46 -- setup/common.sh@18 -- # local node=0
00:04:01.304 16:39:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:01.304 16:39:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
[identical get_meminfo locals/mapfile setup elided]
00:04:01.304 16:39:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 20402092 kB' 'MemUsed: 12183276 kB' 'SwapCached: 0 kB' 'Active: 6481372 kB' 'Inactive: 3554120 kB' 'Active(anon): 6188312 kB' 'Inactive(anon): 0 kB' 'Active(file): 293060 kB' 'Inactive(file): 3554120 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9625512 kB' 'Mapped: 127028 kB' 'AnonPages: 413076 kB' 'Shmem: 5778332 kB' 'KernelStack: 12968 kB' 'PageTables: 4912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149952 kB' 'Slab: 460364 kB' 'SReclaimable: 149952 kB' 'SUnreclaim: 310412 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace continues: setup/common.sh@32 per-key scan of node0 meminfo against HugePages_Surp]
00:04:01.304 16:39:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.304 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # continue 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:01.305 16:39:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:01.305 16:39:46 -- setup/common.sh@33 -- # echo 0 00:04:01.305 16:39:46 -- setup/common.sh@33 -- # return 0 00:04:01.305 16:39:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:01.305 16:39:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:01.305 16:39:46 -- setup/hugepages.sh@116 -- # (( 
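The get_meminfo calls traced above all follow the same pattern: pick the per-node meminfo file when a node index is given, strip the "Node N " prefix those files carry, then scan "Key: value" pairs until the requested key matches and echo its value. A minimal sketch of that logic, reconstructed from the setup/common.sh statements in the trace (a paraphrase of what the trace shows, not the verbatim script; the for-loop over the array stands in for the traced repeated reads):

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) pattern below

    # get_meminfo <key> [node] -- echo the value of <key>, node-local if given
    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        # Prefer the per-NUMA-node view when that node's meminfo exists
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; strip it
        mem=("${mem[@]#Node +([0-9]) }")
        local line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue   # the compare/continue lines above
            echo "$val"
            return 0
        done
        return 1
    }

    get_meminfo HugePages_Surp 0   # prints 0 against the node0 snapshot above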
00:04:01.305 16:39:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:01.305 16:39:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:01.305 16:39:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:01.305 16:39:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:01.305 16:39:46 -- setup/common.sh@18 -- # local node=1
00:04:01.305 16:39:46 -- setup/common.sh@19 -- # local var val
00:04:01.305 16:39:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:01.305 16:39:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:01.305 16:39:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:01.305 16:39:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:01.305 16:39:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:01.305 16:39:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:01.305 16:39:46 -- setup/common.sh@31 -- # IFS=': '
00:04:01.305 16:39:46 -- setup/common.sh@31 -- # read -r var val _
00:04:01.305 16:39:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698400 kB' 'MemFree: 21209924 kB' 'MemUsed: 6488476 kB' 'SwapCached: 0 kB' 'Active: 2912068 kB' 'Inactive: 134196 kB' 'Active(anon): 2788632 kB' 'Inactive(anon): 0 kB' 'Active(file): 123436 kB' 'Inactive(file): 134196 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2907320 kB' 'Mapped: 21184 kB' 'AnonPages: 139060 kB' 'Shmem: 2649688 kB' 'KernelStack: 8824 kB' 'PageTables: 2672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85100 kB' 'Slab: 443008 kB' 'SReclaimable: 85100 kB' 'SUnreclaim: 357908 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... compare/continue loop elided: every node1 meminfo key from MemTotal through HugePages_Free is tested against \H\u\g\e\P\a\g\e\s\_\S\u\r\p and skipped; the wall clock ticks from 16:39:46 to 16:39:47 partway through ...]
00:04:01.306 16:39:47 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:01.306 16:39:47 -- setup/common.sh@33 -- # echo 0
00:04:01.306 16:39:47 -- setup/common.sh@33 -- # return 0
00:04:01.306 16:39:47 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:01.306 16:39:47 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:01.306 16:39:47 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:01.306 16:39:47 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:01.306 16:39:47 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:04:01.306 16:39:47 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:01.306 16:39:47 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:01.306 16:39:47 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:01.306 16:39:47 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:04:01.306 16:39:47 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:01.306
00:04:01.306 real	0m3.773s
00:04:01.306 user	0m1.436s
00:04:01.306 sys	0m2.407s
00:04:01.306 16:39:47 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:01.306 16:39:47 -- common/autotest_common.sh@10 -- # set +x
00:04:01.306 ************************************
00:04:01.306 END TEST per_node_1G_alloc
00:04:01.306 ************************************
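The verification that just closed per_node_1G_alloc amounts to: per node, expected = pages configured for the node, plus reserved, plus surplus, compared against the 512 the test asked for. With resv and HugePages_Surp both 0 on node0 and node1, each check reduces to "nodeN=512 expecting 512". A worked sketch of that accounting, reusing the get_meminfo sketch above (the command substitution capturing the surplus is an assumption; the trace only shows the echoed 0 being added at hugepages.sh@117):

    nodes_test=([0]=512 [1]=512)   # per-node test allocation, as set earlier
    resv=0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))                # hugepages.sh@116
        surp=$(get_meminfo HugePages_Surp "$node")    # hugepages.sh@117, 0 here
        (( nodes_test[node] += surp ))
        echo "node$node=${nodes_test[node]} expecting 512"
    done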
00:04:01.568 16:39:47 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:01.568 16:39:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:01.568 16:39:47 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:01.568 16:39:47 -- common/autotest_common.sh@10 -- # set +x
00:04:01.568 ************************************
00:04:01.568 START TEST even_2G_alloc
00:04:01.568 ************************************
00:04:01.568 16:39:47 -- common/autotest_common.sh@1114 -- # even_2G_alloc
00:04:01.568 16:39:47 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:04:01.569 16:39:47 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:01.569 16:39:47 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:01.569 16:39:47 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:01.569 16:39:47 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:01.569 16:39:47 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:01.569 16:39:47 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:01.569 16:39:47 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:01.569 16:39:47 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:01.569 16:39:47 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:01.569 16:39:47 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:01.569 16:39:47 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:01.569 16:39:47 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:01.569 16:39:47 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:01.569 16:39:47 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:01.569 16:39:47 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:01.569 16:39:47 -- setup/hugepages.sh@83 -- # : 512
00:04:01.569 16:39:47 -- setup/hugepages.sh@84 -- # : 1
00:04:01.569 16:39:47 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:01.569 16:39:47 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:01.569 16:39:47 -- setup/hugepages.sh@83 -- # : 0
00:04:01.569 16:39:47 -- setup/hugepages.sh@84 -- # : 0
00:04:01.569 16:39:47 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:01.569 16:39:47 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:01.569 16:39:47 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:01.569 16:39:47 -- setup/hugepages.sh@153 -- # setup output
00:04:01.569 16:39:47 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:01.569 16:39:47 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:04.869 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:04.869 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
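The numbers in the get_test_nr_hugepages trace above line up as follows: the requested size of 2097152 kB (2 GiB) over the 2048 kB Hugepagesize seen in the meminfo snapshots gives nr_hugepages=1024, and with HUGE_EVEN_ALLOC=yes the per-node helper assigns 512 to each of the two nodes, walking from the last node down. A worked version of that arithmetic (the explicit divisions are an assumption consistent with the traced values; the trace only shows the resulting 1024 and the two 512 assignments):

    size_kb=2097152        # argument to get_test_nr_hugepages
    hugepagesize_kb=2048   # Hugepagesize from /proc/meminfo
    nr_hugepages=$(( size_kb / hugepagesize_kb ))   # 1024

    _no_nodes=2
    declare -a nodes_test
    per_node=$(( nr_hugepages / _no_nodes ))        # 512
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$per_node         # hugepages.sh@82
        (( _no_nodes-- ))
    done
    echo "${nodes_test[@]}"   # 512 512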
00:04:05.133 16:39:50 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:05.133 16:39:50 -- setup/hugepages.sh@89 -- # local node
00:04:05.133 16:39:50 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:05.133 16:39:50 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:05.133 16:39:50 -- setup/hugepages.sh@92 -- # local surp
00:04:05.133 16:39:50 -- setup/hugepages.sh@93 -- # local resv
00:04:05.133 16:39:50 -- setup/hugepages.sh@94 -- # local anon
00:04:05.133 16:39:50 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:05.133 16:39:50 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:05.133 16:39:50 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:05.133 16:39:50 -- setup/common.sh@18 -- # local node=
00:04:05.133 16:39:50 -- setup/common.sh@19 -- # local var val
00:04:05.133 16:39:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.133 16:39:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.133 16:39:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:05.133 16:39:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:05.133 16:39:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.133 16:39:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.133 16:39:50 -- setup/common.sh@31 -- # IFS=': '
00:04:05.133 16:39:50 -- setup/common.sh@31 -- # read -r var val _
00:04:05.133 16:39:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41632840 kB' 'MemAvailable: 45358304 kB' 'Buffers: 8940 kB' 'Cached: 12523976 kB' 'SwapCached: 0 kB' 'Active: 9396448 kB' 'Inactive: 3688316 kB' 'Active(anon): 8979952 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555296 kB' 'Mapped: 148304 kB' 'Shmem: 8428104 kB' 'KReclaimable: 235036 kB' 'Slab: 902880 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667844 kB' 'KernelStack: 21792 kB' 'PageTables: 7656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10231188 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214176 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[... compare/continue loop elided: every meminfo key from MemTotal through HardwareCorrupted is tested against \A\n\o\n\H\u\g\e\P\a\g\e\s and skipped ...]
00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:05.134 16:39:50 -- setup/common.sh@33 -- # echo 0
00:04:05.134 16:39:50 -- setup/common.sh@33 -- # return 0
00:04:05.134 16:39:50 -- setup/hugepages.sh@97 -- # anon=0
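The gate at setup/hugepages.sh@96 tests the transparent-hugepage mode string, here "always [madvise] never" with brackets marking the active mode, against the pattern *\[never\]*; since THP is not set to never, AnonHugePages is sampled so it can be discounted later, and it reads back 0 on this box. A sketch of that check (the sysfs path below is an assumption, the conventional location of the mode string; get_meminfo is the sketch from earlier):

    # e.g. "always [madvise] never"; a selected [never] would mean THP is off
    thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)
    anon=0
    if [[ $thp != *\[never\]* ]]; then
        anon=$(get_meminfo AnonHugePages)   # 0 kB in the snapshot above
    fi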
mem=("${mem[@]#Node +([0-9]) }") 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41634060 kB' 'MemAvailable: 45359524 kB' 'Buffers: 8940 kB' 'Cached: 12523976 kB' 'SwapCached: 0 kB' 'Active: 9395532 kB' 'Inactive: 3688316 kB' 'Active(anon): 8979036 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 554404 kB' 'Mapped: 148296 kB' 'Shmem: 8428104 kB' 'KReclaimable: 235036 kB' 'Slab: 902896 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667860 kB' 'KernelStack: 21744 kB' 'PageTables: 7464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10231200 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214144 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB' 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 
16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.134 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.134 16:39:50 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 
16:39:50 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ CmaTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # continue 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # IFS=': ' 00:04:05.135 16:39:50 -- setup/common.sh@31 -- # read -r var val _ 00:04:05.135 16:39:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:05.135 16:39:50 -- setup/common.sh@33 -- # echo 0 00:04:05.135 16:39:50 -- setup/common.sh@33 -- # return 0 00:04:05.135 16:39:50 -- setup/hugepages.sh@99 -- # surp=0 00:04:05.135 16:39:50 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:05.135 16:39:50 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:05.135 16:39:50 -- setup/common.sh@18 -- # local node= 00:04:05.135 16:39:50 -- setup/common.sh@19 -- # local var val 00:04:05.135 16:39:50 -- setup/common.sh@20 -- # local mem_f mem 00:04:05.135 16:39:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:05.135 16:39:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:05.135 16:39:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:05.135 16:39:50 -- setup/common.sh@28 -- # mapfile -t mem 00:04:05.135 16:39:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:05.136 16:39:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41635236 kB' 'MemAvailable: 45360700 kB' 'Buffers: 8940 kB' 'Cached: 12523988 kB' 'SwapCached: 0 kB' 'Active: 9395740 kB' 'Inactive: 3688316 kB' 'Active(anon): 8979244 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 554648 kB' 'Mapped: 148216 kB' 'Shmem: 8428116 kB' 'KReclaimable: 235036 kB' 'Slab: 902832 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667796 kB' 'KernelStack: 21792 kB' 'PageTables: 7584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10234416 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214144 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 
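The surp value just read and the resv value fetched next both come from the global HugePages_Surp and HugePages_Rsvd fields; outside the test harness the same pair can be pulled in one pass (an equivalent awk cross-check, not part of the SPDK scripts):

    awk '/^HugePages_(Rsvd|Surp):/ { print $1, $2 }' /proc/meminfo
    # HugePages_Rsvd: 0
    # HugePages_Surp: 0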
[... xtrace elided: the read loop scans the snapshot above field by field, skipping each with `continue`, until HugePages_Rsvd matches ...]
00:04:05.137 16:39:50 -- setup/common.sh@33 -- # echo 0
00:04:05.137 16:39:50 -- setup/common.sh@33 -- # return 0
00:04:05.137 16:39:50 -- setup/hugepages.sh@100 -- # resv=0
00:04:05.137 16:39:50 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:05.137 nr_hugepages=1024
00:04:05.137 16:39:50 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:05.137 resv_hugepages=0
00:04:05.137 16:39:50 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:05.137 surplus_hugepages=0
00:04:05.137 16:39:50 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:05.137 anon_hugepages=0
00:04:05.137 16:39:50 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:05.137 16:39:50 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:05.137 16:39:50 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:05.137 16:39:50 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:05.137 16:39:50 -- setup/common.sh@18 -- # local node=
00:04:05.137 16:39:50 -- setup/common.sh@19 -- # local var val
00:04:05.137 16:39:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.137 16:39:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.137 16:39:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:05.137 16:39:50 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:05.137 16:39:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.137 16:39:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.137 16:39:50 -- setup/common.sh@31 -- # IFS=': '
00:04:05.137 16:39:50 -- setup/common.sh@31 -- # read -r var val _
00:04:05.137 16:39:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41635244 kB' 'MemAvailable: 45360708 kB' 'Buffers: 8940 kB' 'Cached: 12524004 kB' 'SwapCached: 0 kB' 'Active: 9396044 kB' 'Inactive: 3688316 kB' 'Active(anon): 8979548 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 554928 kB' 'Mapped: 148216 kB' 'Shmem: 8428132 kB' 'KReclaimable: 235036 kB' 'Slab: 902812 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667776 kB' 'KernelStack: 21744 kB' 'PageTables: 7448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10234260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214096 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[... xtrace elided: field-by-field scan of the snapshot above until HugePages_Total matches ...]
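Having derived surp=0 and resv=0 from the two scans above, the test cross-checks the kernel's hugepage pool against what it requested. A compact sketch of that bookkeeping (variable names mirror the trace; it assumes the get_meminfo sketch shown earlier, not the verbatim setup/hugepages.sh):

    # Sketch of the consistency check in the verify path.
    nr_hugepages=1024                      # pages the test asked the kernel for
    surp=$(get_meminfo HugePages_Surp)     # surplus pages beyond the static pool
    resv=$(get_meminfo HugePages_Rsvd)     # pages reserved but not yet faulted in
    total=$(get_meminfo HugePages_Total)   # pages the kernel actually allocated

    echo "nr_hugepages=$nr_hugepages"
    echo "resv_hugepages=$resv"
    echo "surplus_hugepages=$surp"
    # The pool must add up exactly: here 1024 == 1024 + 0 + 0.
    (( total == nr_hugepages + surp + resv )) || exit 1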
00:04:05.138 16:39:50 -- setup/common.sh@33 -- # echo 1024
00:04:05.138 16:39:50 -- setup/common.sh@33 -- # return 0
00:04:05.139 16:39:50 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:05.139 16:39:50 -- setup/hugepages.sh@112 -- # get_nodes
00:04:05.139 16:39:50 -- setup/hugepages.sh@27 -- # local node
00:04:05.139 16:39:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:05.139 16:39:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:05.139 16:39:50 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:05.139 16:39:50 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:05.139 16:39:50 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:05.139 16:39:50 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:05.139 16:39:50 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:05.139 16:39:50 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:05.139 16:39:50 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:05.139 16:39:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:05.139 16:39:50 -- setup/common.sh@18 -- # local node=0
00:04:05.139 16:39:50 -- setup/common.sh@19 -- # local var val
00:04:05.139 16:39:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.139 16:39:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.139 16:39:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:05.139 16:39:50 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:05.139 16:39:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.139 16:39:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.139 16:39:50 -- setup/common.sh@31 -- # IFS=': '
00:04:05.139 16:39:50 -- setup/common.sh@31 -- # read -r var val _
00:04:05.139 16:39:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 20422624 kB' 'MemUsed: 12162744 kB' 'SwapCached: 0 kB' 'Active: 6482080 kB' 'Inactive: 3554120 kB' 'Active(anon): 6189020 kB' 'Inactive(anon): 0 kB' 'Active(file): 293060 kB' 'Inactive(file): 3554120 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9625552 kB' 'Mapped: 127032 kB' 'AnonPages: 413812 kB' 'Shmem: 5778372 kB' 'KernelStack: 12872 kB' 'PageTables: 4576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149936 kB' 'Slab: 460108 kB' 'SReclaimable: 149936 kB' 'SUnreclaim: 310172 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: field-by-field scan of the node0 snapshot until HugePages_Surp matches ...]
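One detail worth noting in the preamble above: unlike /proc/meminfo, the per-node file /sys/devices/system/node/node0/meminfo prefixes every line with "Node 0 ", which is why the trace shows the mem=("${mem[@]#Node +([0-9]) }") expansion before parsing. A two-line illustration (hypothetical input line):

    shopt -s extglob
    line='Node 0 HugePages_Total:   512'
    echo "${line#Node +([0-9]) }"   # prints: HugePages_Total:   512

With the prefix stripped, the same IFS=': ' read parser serves both the global and the per-node files.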
00:04:05.140 16:39:50 -- setup/common.sh@33 -- # echo 0
00:04:05.140 16:39:50 -- setup/common.sh@33 -- # return 0
00:04:05.140 16:39:50 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:05.140 16:39:50 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:05.140 16:39:50 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:05.140 16:39:50 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:05.140 16:39:50 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:05.140 16:39:50 -- setup/common.sh@18 -- # local node=1
00:04:05.140 16:39:50 -- setup/common.sh@19 -- # local var val
00:04:05.140 16:39:50 -- setup/common.sh@20 -- # local mem_f mem
00:04:05.140 16:39:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:05.140 16:39:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:05.140 16:39:50 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:05.140 16:39:50 -- setup/common.sh@28 -- # mapfile -t mem
00:04:05.140 16:39:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:05.140 16:39:50 -- setup/common.sh@31 -- # IFS=': '
00:04:05.140 16:39:50 -- setup/common.sh@31 -- # read -r var val _
00:04:05.140 16:39:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698400 kB' 'MemFree: 21212872 kB' 'MemUsed: 6485528 kB' 'SwapCached: 0 kB' 'Active: 2913424 kB' 'Inactive: 134196 kB' 'Active(anon): 2789988 kB' 'Inactive(anon): 0 kB' 'Active(file): 123436 kB' 'Inactive(file): 134196 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2907420 kB' 'Mapped: 21184 kB' 'AnonPages: 140500 kB' 'Shmem: 2649788 kB' 'KernelStack: 8888 kB' 'PageTables: 2912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85100 kB' 'Slab: 442704 kB' 'SReclaimable: 85100 kB' 'SUnreclaim: 357604 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... xtrace elided: field-by-field scan of the node1 snapshot until HugePages_Surp matches ...]
00:04:05.141 16:39:50 -- setup/common.sh@33 -- # echo 0
00:04:05.141 16:39:50 -- setup/common.sh@33 -- # return 0
00:04:05.141 16:39:50 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
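Both nodes report HugePages_Surp of 0, so the expected per-node counts stay at the 512 pages that get_nodes read from sysfs. A sketch of the per-node bookkeeping this trace is executing (an assumed shape consistent with the xtrace, not the verbatim setup/hugepages.sh loop; it reuses the get_meminfo sketch from earlier):

    # Sketch: fold reserved and per-node surplus pages into the expectation.
    declare -a nodes_test=(512 512)   # what the test expects on node0/node1
    declare -a nodes_sys=(512 512)    # what get_nodes read from sysfs
    resv=0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))                  # spread reserved pages
        surp=$(get_meminfo HugePages_Surp "$node")      # per-node surplus, 0 here
        (( nodes_test[node] += surp ))
        echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done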
00:04:05.141 16:39:50 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:05.141 16:39:50 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:05.141 16:39:50 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:05.141 16:39:50 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:05.141 node0=512 expecting 512
00:04:05.141 16:39:50 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:05.141 16:39:50 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:05.141 16:39:50 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:05.141 16:39:50 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:05.141 node1=512 expecting 512
00:04:05.141 16:39:50 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:05.141
00:04:05.141 real 0m3.785s
00:04:05.141 user 0m1.467s
00:04:05.141 sys 0m2.391s
00:04:05.141 16:39:50 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:05.141 16:39:50 -- common/autotest_common.sh@10 -- # set +x
00:04:05.141 ************************************
00:04:05.141 END TEST even_2G_alloc
00:04:05.141 ************************************
00:04:05.400 16:39:50 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:04:05.400 16:39:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:05.400 16:39:50 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:05.400 16:39:50 -- common/autotest_common.sh@10 -- # set +x
00:04:05.400 ************************************
00:04:05.400 START TEST odd_alloc
00:04:05.400 ************************************
00:04:05.400 16:39:50 -- common/autotest_common.sh@1114 -- # odd_alloc
00:04:05.400 16:39:50 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:04:05.400 16:39:50 -- setup/hugepages.sh@49 -- # local size=2098176
00:04:05.400 16:39:50 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:05.400 16:39:50 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:05.400 16:39:50 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:04:05.400 16:39:50 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:05.400 16:39:50 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:05.400 16:39:50 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:05.400 16:39:50 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:04:05.400 16:39:50 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:05.400 16:39:50 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:05.400 16:39:50 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:05.400 16:39:50 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:05.400 16:39:50 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:05.401 16:39:50 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:05.401 16:39:50 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:05.401 16:39:50 -- setup/hugepages.sh@83 -- # : 513
00:04:05.401 16:39:50 -- setup/hugepages.sh@84 -- # : 1
00:04:05.401 16:39:50 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:05.401 16:39:50 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:04:05.401 16:39:50 -- setup/hugepages.sh@83 -- # : 0
00:04:05.401 16:39:50 -- setup/hugepages.sh@84 -- # : 0
00:04:05.401 16:39:50 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:05.401 16:39:50 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:04:05.401 16:39:50 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:04:05.401 16:39:50 -- setup/hugepages.sh@160 -- # setup output
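The odd_alloc sizing above is worth spelling out: 2098176 kB is 2049 MB (hence HUGEMEM=2049), which is 1024.5 default 2 MiB pages, and the trace shows nr_hugepages=1025 with the two nodes split 513/512, node0 taking the odd page. A sketch of that arithmetic (constants copied from the log; the round-up expression is an assumption consistent with the result, not the verbatim get_test_nr_hugepages):

    size_kb=2098176                # requested pool size; 2049 MB
    page_kb=2048                   # default 2 MiB hugepage
    hugemem=$(( size_kb / 1024 ))  # 2049, exported as HUGEMEM
    # Round the request up to whole pages: 1024.5 -> 1025.
    nr_hugepages=$(( (size_kb + page_kb - 1) / page_kb ))
    # Two NUMA nodes; the last node gets the floor, the first the remainder,
    # matching nodes_test[1]=512 and nodes_test[0]=513 in the trace.
    no_nodes=2
    nodes_test[1]=$(( nr_hugepages / no_nodes ))         # 512
    nodes_test[0]=$(( nr_hugepages - nodes_test[1] ))    # 513
    echo "HUGEMEM=$hugemem nr_hugepages=$nr_hugepages node0=${nodes_test[0]} node1=${nodes_test[1]}"

With HUGE_EVEN_ALLOC=yes, setup.sh is then expected to spread that odd total as evenly as the node count allows, which the verify pass below checks.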
00:04:05.401 16:39:50 -- setup/hugepages.sh@160 -- # setup output
00:04:05.401 16:39:50 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:05.401 16:39:50 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh output
00:04:08.694 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:08.694 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:08.959 16:39:54 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:08.959 16:39:54 -- setup/hugepages.sh@89 -- # local node
00:04:08.959 16:39:54 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:08.959 16:39:54 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:08.959 16:39:54 -- setup/hugepages.sh@92 -- # local surp
00:04:08.959 16:39:54 -- setup/hugepages.sh@93 -- # local resv
00:04:08.959 16:39:54 -- setup/hugepages.sh@94 -- # local anon
00:04:08.959 16:39:54 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:08.959 16:39:54 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:08.959 16:39:54 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:08.959 16:39:54 -- setup/common.sh@18 -- # local node=
00:04:08.959 16:39:54 -- setup/common.sh@19 -- # local var val
00:04:08.959 16:39:54 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.959 16:39:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.959 16:39:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:08.959 16:39:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:08.959 16:39:54 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.959 16:39:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:08.959 16:39:54 -- setup/common.sh@31 -- # IFS=': '
00:04:08.959 16:39:54 -- setup/common.sh@31 -- # read -r var val _
00:04:08.959 16:39:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41640352 kB' 'MemAvailable: 45365816 kB' 'Buffers: 8940 kB' 'Cached: 12524112 kB' 'SwapCached: 0 kB' 'Active: 9393908 kB' 'Inactive: 3688316 kB' 'Active(anon): 8977412 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552396 kB' 'Mapped: 148248 kB' 'Shmem: 8428240 kB' 'KReclaimable: 235036 kB' 'Slab: 902792 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667756 kB' 'KernelStack: 21744 kB' 'PageTables: 7428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 10231844 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[ per-key scan condensed: every field from MemTotal through HardwareCorrupted is skipped with 'continue' until AnonHugePages matches ]
00:04:08.960 16:39:54 -- setup/common.sh@33 -- # echo 0
00:04:08.960 16:39:54 -- setup/common.sh@33 -- # return 0
00:04:08.960 16:39:54 -- setup/hugepages.sh@97 -- # anon=0
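Every IFS=': ' / read -r var val _ / continue entry in this trace comes from one small lookup pattern in setup/common.sh. A self-contained sketch of that pattern, assuming bash 4+ for mapfile; this is a simplified stand-in for get_meminfo, not the verbatim function:

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the +([0-9]) prefix-strip pattern below

    get_meminfo() {
      local get=$1 node=${2:-}
      local var val _ line
      local mem_f=/proc/meminfo
      # with a node argument, read that node's meminfo from /sys instead
      if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      local -a mem
      mapfile -t mem <"$mem_f"
      # per-node lines carry a "Node 0 " style prefix; strip it
      mem=("${mem[@]#Node +([0-9]) }")
      for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] || continue   # skip every non-matching key
        echo "$val"
        return 0
      done
      return 1
    }

    get_meminfo HugePages_Total      # system-wide count (1025 on this box)
    get_meminfo HugePages_Surp 0     # same field, but for NUMA node 0

The 'continue' on every non-matching key is why each call produces one trace entry per meminfo field until the requested key is reached.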
00:04:08.960 16:39:54 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:08.960 16:39:54 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:08.960 16:39:54 -- setup/common.sh@18 -- # local node=
00:04:08.960 16:39:54 -- setup/common.sh@19 -- # local var val
00:04:08.960 16:39:54 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.960 16:39:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.960 16:39:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:08.960 16:39:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:08.960 16:39:54 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.960 16:39:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:08.960 16:39:54 -- setup/common.sh@31 -- # IFS=': '
00:04:08.960 16:39:54 -- setup/common.sh@31 -- # read -r var val _
00:04:08.960 16:39:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41640672 kB' 'MemAvailable: 45366136 kB' 'Buffers: 8940 kB' 'Cached: 12524116 kB' 'SwapCached: 0 kB' 'Active: 9393624 kB' 'Inactive: 3688316 kB' 'Active(anon): 8977128 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552128 kB' 'Mapped: 148224 kB' 'Shmem: 8428244 kB' 'KReclaimable: 235036 kB' 'Slab: 902820 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667784 kB' 'KernelStack: 21728 kB' 'PageTables: 7392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 10231856 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[ per-key scan condensed: every field from MemTotal through HugePages_Free is skipped with 'continue' until HugePages_Surp matches ]
00:04:08.961 16:39:54 -- setup/common.sh@33 -- # echo 0
00:04:08.961 16:39:54 -- setup/common.sh@33 -- # return 0
00:04:08.961 16:39:54 -- setup/hugepages.sh@99 -- # surp=0
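For reference, the page count being verified follows from the requested size: get_test_nr_hugepages was called with 2098176 kB against the default 2048 kB hugepage size, and the trace shows nr_hugepages=1025, i.e. 1024.5 rounded up. A sketch of that conversion, assuming ceiling division (the exact rounding expression in hugepages.sh is not visible in this trace):

    #!/usr/bin/env bash
    # Assumed ceiling-division conversion from a size in kB to a count of
    # 2 MiB hugepages; consistent with 2098176 kB -> 1025 pages.
    size_kb=2098176
    hugepage_kb=2048
    nr_hugepages=$(( (size_kb + hugepage_kb - 1) / hugepage_kb ))
    echo "nr_hugepages=$nr_hugepages"   # prints nr_hugepages=1025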
00:04:08.961 16:39:54 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:08.961 16:39:54 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:08.961 16:39:54 -- setup/common.sh@18 -- # local node=
00:04:08.961 16:39:54 -- setup/common.sh@19 -- # local var val
00:04:08.961 16:39:54 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.961 16:39:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.961 16:39:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:08.961 16:39:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:08.961 16:39:54 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.961 16:39:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:08.962 16:39:54 -- setup/common.sh@31 -- # IFS=': '
00:04:08.962 16:39:54 -- setup/common.sh@31 -- # read -r var val _
00:04:08.962 16:39:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41641092 kB' 'MemAvailable: 45366556 kB' 'Buffers: 8940 kB' 'Cached: 12524116 kB' 'SwapCached: 0 kB' 'Active: 9393664 kB' 'Inactive: 3688316 kB' 'Active(anon): 8977168 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 552156 kB' 'Mapped: 148224 kB' 'Shmem: 8428244 kB' 'KReclaimable: 235036 kB' 'Slab: 902820 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667784 kB' 'KernelStack: 21728 kB' 'PageTables: 7412 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 10232020 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[ per-key scan condensed: every field from MemTotal through HugePages_Free is skipped with 'continue' until HugePages_Rsvd matches ]
00:04:08.963 16:39:54 -- setup/common.sh@33 -- # echo 0
00:04:08.963 16:39:54 -- setup/common.sh@33 -- # return 0
00:04:08.963 16:39:54 -- setup/hugepages.sh@100 -- # resv=0
00:04:08.963 16:39:54 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:08.963 nr_hugepages=1025
00:04:08.963 16:39:54 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:08.963 resv_hugepages=0
00:04:08.963 16:39:54 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:08.963 surplus_hugepages=0
00:04:08.963 16:39:54 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:08.963 anon_hugepages=0
00:04:08.963 16:39:54 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:08.963 16:39:54 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
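The consistency checks traced here reduce to one accounting identity over the values just collected. A paraphrase with the numbers from this run (variable names follow the trace; the script itself is not quoted):

    #!/usr/bin/env bash
    # Accounting identity behind the traced checks: the kernel's total must
    # equal the requested pages plus any surplus and reserved pages.
    nr_hugepages=1025   # requested by the odd_alloc test
    surp=0              # get_meminfo HugePages_Surp
    resv=0              # get_meminfo HugePages_Rsvd
    total=1025          # get_meminfo HugePages_Total

    if (( total == nr_hugepages + surp + resv )); then
      echo "hugepage accounting consistent: total=$total"
    else
      echo "hugepage accounting mismatch: total=$total" >&2
      exit 1
    fi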
00:04:08.963 16:39:54 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:08.963 16:39:54 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:08.963 16:39:54 -- setup/common.sh@18 -- # local node=
00:04:08.963 16:39:54 -- setup/common.sh@19 -- # local var val
00:04:08.963 16:39:54 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.963 16:39:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.963 16:39:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:08.963 16:39:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:08.963 16:39:54 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.963 16:39:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:08.963 16:39:54 -- setup/common.sh@31 -- # IFS=': '
00:04:08.963 16:39:54 -- setup/common.sh@31 -- # read -r var val _
00:04:08.963 16:39:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41641452 kB' 'MemAvailable: 45366916 kB' 'Buffers: 8940 kB' 'Cached: 12524120 kB' 'SwapCached: 0 kB' 'Active: 9394520 kB' 'Inactive: 3688316 kB' 'Active(anon): 8978024 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 553020 kB' 'Mapped: 148224 kB' 'Shmem: 8428248 kB' 'KReclaimable: 235036 kB' 'Slab: 902820 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667784 kB' 'KernelStack: 21728 kB' 'PageTables: 7416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 10231884 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[ per-key scan condensed: every field from MemTotal through Unaccepted is skipped with 'continue' until HugePages_Total matches ]
00:04:08.964 16:39:54 -- setup/common.sh@33 -- # echo 1025
00:04:08.964 16:39:54 -- setup/common.sh@33 -- # return 0
00:04:08.964 16:39:54 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:08.964 16:39:54 -- setup/hugepages.sh@112 -- # get_nodes
00:04:08.964 16:39:54 -- setup/hugepages.sh@27 -- # local node
00:04:08.964 16:39:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:08.964 16:39:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:08.965 16:39:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:08.965 16:39:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:08.965 16:39:54 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:08.965 16:39:54 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:08.965 16:39:54 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:08.965 16:39:54 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:08.965 16:39:54 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:08.965 16:39:54 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:08.965 16:39:54 -- setup/common.sh@18 -- # local node=0
00:04:08.965 16:39:54 -- setup/common.sh@19 -- # local var val
00:04:08.965 16:39:54 -- setup/common.sh@20 -- # local mem_f mem
00:04:08.965 16:39:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:08.965 16:39:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:08.965 16:39:54 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:08.965 16:39:54 -- setup/common.sh@28 -- # mapfile -t mem
00:04:08.965 16:39:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': '
00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _
00:04:08.965 16:39:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 20420312 kB' 'MemUsed: 12165056 kB' 'SwapCached: 0 kB' 'Active: 6481756 kB' 'Inactive: 3554120 kB' 'Active(anon): 6188696 kB' 'Inactive(anon): 0 kB' 'Active(file): 
293060 kB' 'Inactive(file): 3554120 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9625592 kB' 'Mapped: 127040 kB' 'AnonPages: 413460 kB' 'Shmem: 5778412 kB' 'KernelStack: 12904 kB' 'PageTables: 4672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149936 kB' 'Slab: 460004 kB' 'SReclaimable: 149936 kB' 'SUnreclaim: 310068 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- 
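
The long run of `[[ <key> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]` / `continue` records above and below is setup/common.sh's get_meminfo helper walking a meminfo snapshot one "Key: value" record at a time until the requested key matches, then echoing its value. A minimal sketch of that scan, reading /proc/meminfo directly instead of through the script's mapfile buffer (get_meminfo_sketch is an illustrative name, not the actual SPDK helper):

get_meminfo_sketch() {
    local get=$1 var val _            # e.g. get=HugePages_Total
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"               # the "echo 1025" step in the trace
            return 0
        fi
        # every non-matching key is one "continue" record in the trace
    done < /proc/meminfo
    return 1
}
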
setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- 
setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.965 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.965 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@33 -- # echo 0 00:04:08.966 16:39:54 -- setup/common.sh@33 -- # return 0 00:04:08.966 16:39:54 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:08.966 16:39:54 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:08.966 16:39:54 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:08.966 16:39:54 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:08.966 16:39:54 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:08.966 16:39:54 -- setup/common.sh@18 -- # local node=1 00:04:08.966 16:39:54 -- setup/common.sh@19 -- # local var val 00:04:08.966 16:39:54 -- setup/common.sh@20 -- # local mem_f mem 00:04:08.966 16:39:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:08.966 16:39:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:08.966 16:39:54 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:08.966 16:39:54 -- setup/common.sh@28 -- # mapfile -t mem 00:04:08.966 16:39:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698400 kB' 'MemFree: 21223708 kB' 'MemUsed: 6474692 kB' 'SwapCached: 0 kB' 'Active: 2912516 kB' 'Inactive: 134196 kB' 'Active(anon): 2789080 kB' 'Inactive(anon): 0 kB' 'Active(file): 123436 kB' 'Inactive(file): 134196 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2907472 kB' 'Mapped: 21184 kB' 'AnonPages: 139332 kB' 'Shmem: 2649840 kB' 'KernelStack: 8792 kB' 'PageTables: 2612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85100 kB' 'Slab: 442800 kB' 'SReclaimable: 85100 kB' 'SUnreclaim: 357700 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- 
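
When get_meminfo is given a node argument, the trace above shows the snapshot source being switched from /proc/meminfo to the per-node sysfs file, whose lines carry a "Node <n> " prefix that is stripped before the same key scan runs. A condensed sketch of just that source selection, with node=1 hard-coded for illustration:

shopt -s extglob                      # the +([0-9]) pattern needs extglob
node=1
mem_f=/proc/meminfo
if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
fi
mapfile -t mem < "$mem_f"
# "Node 1 HugePages_Total:  513" -> "HugePages_Total:  513"
mem=("${mem[@]#Node +([0-9]) }")
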
setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- 
setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.966 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.966 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.966 16:39:54 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # continue 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # IFS=': ' 00:04:08.967 16:39:54 -- setup/common.sh@31 -- # read -r var val _ 00:04:08.967 16:39:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:08.967 16:39:54 -- setup/common.sh@33 -- # echo 0 00:04:08.967 16:39:54 -- setup/common.sh@33 -- # return 0 00:04:08.967 16:39:54 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:08.967 16:39:54 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:08.967 16:39:54 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:08.967 16:39:54 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:08.967 node0=512 expecting 513 00:04:08.967 16:39:54 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:08.967 16:39:54 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:08.967 16:39:54 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:08.967 16:39:54 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:08.967 node1=513 expecting 512 00:04:08.967 16:39:54 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:08.967 00:04:08.967 real 0m3.703s 00:04:08.967 user 0m1.457s 00:04:08.967 sys 0m2.316s 00:04:08.967 16:39:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:08.967 16:39:54 -- common/autotest_common.sh@10 -- # set +x 00:04:08.967 ************************************ 00:04:08.967 END TEST odd_alloc 00:04:08.967 ************************************ 00:04:08.967 16:39:54 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:08.967 16:39:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:08.967 16:39:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:08.967 16:39:54 -- 
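
The "node0=512 expecting 513" / "node1=513 expecting 512" pair above is not a failure: odd_alloc only requires the multiset of per-node counts to match, so both sides are keyed into indexed arrays and the sorted key lists are compared, which is why the final check collapses to `[[ 512 513 == 512 513 ]]`. A sketch of that comparison under one plausible assignment matching this run's 512/513 split:

nodes_test=([0]=513 [1]=512)          # counts the test computed
nodes_sys=([0]=512 [1]=513)           # counts read back from sysfs
sorted_t=() sorted_s=()
for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1      # key the count, not the node
    sorted_s[nodes_sys[node]]=1
done
# indexed-array key expansion is ascending, so both sides print "512 513"
[[ "${!sorted_s[*]}" == "${!sorted_t[*]}" ]] && echo "odd_alloc OK"
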
common/autotest_common.sh@10 -- # set +x 00:04:08.967 ************************************ 00:04:08.967 START TEST custom_alloc 00:04:08.967 ************************************ 00:04:08.967 16:39:54 -- common/autotest_common.sh@1114 -- # custom_alloc 00:04:08.967 16:39:54 -- setup/hugepages.sh@167 -- # local IFS=, 00:04:08.967 16:39:54 -- setup/hugepages.sh@169 -- # local node 00:04:08.967 16:39:54 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:08.967 16:39:54 -- setup/hugepages.sh@170 -- # local nodes_hp 00:04:08.967 16:39:54 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:08.967 16:39:54 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:08.967 16:39:54 -- setup/hugepages.sh@49 -- # local size=1048576 00:04:08.967 16:39:54 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:08.967 16:39:54 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:08.967 16:39:54 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:08.967 16:39:54 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:08.967 16:39:54 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:08.967 16:39:54 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:08.967 16:39:54 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:08.967 16:39:54 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:08.967 16:39:54 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:08.967 16:39:54 -- setup/hugepages.sh@83 -- # : 256 00:04:08.967 16:39:54 -- setup/hugepages.sh@84 -- # : 1 00:04:08.967 16:39:54 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:08.967 16:39:54 -- setup/hugepages.sh@83 -- # : 0 00:04:08.967 16:39:54 -- setup/hugepages.sh@84 -- # : 0 00:04:08.967 16:39:54 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:08.967 16:39:54 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:08.967 16:39:54 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:08.967 16:39:54 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:08.967 16:39:54 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:08.967 16:39:54 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:08.967 16:39:54 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:08.967 16:39:54 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:08.967 16:39:54 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:08.967 16:39:54 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:08.967 16:39:54 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:08.967 16:39:54 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:08.967 16:39:54 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:08.967 16:39:54 -- 
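
The two get_test_nr_hugepages calls traced here turn a requested test size in kB into a page count by dividing by the default hugepage size, which is why 1048576 kB becomes nr_hugepages=512 and 2097152 kB becomes 1024 at 2048 kB per page. A sketch of that conversion, assuming default_hugepages=2048 kB as reported by Hugepagesize (the function name is illustrative):

default_hugepages=2048                      # kB, from /proc/meminfo
get_test_nr_hugepages_sketch() {
    local size=$1                           # requested size in kB
    (( size >= default_hugepages )) || return 1
    echo $(( size / default_hugepages ))
}
get_test_nr_hugepages_sketch 1048576        # -> 512
get_test_nr_hugepages_sketch 2097152        # -> 1024
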
setup/hugepages.sh@78 -- # return 0 00:04:08.967 16:39:54 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:08.967 16:39:54 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:08.967 16:39:54 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:08.967 16:39:54 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:08.967 16:39:54 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:08.967 16:39:54 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:08.967 16:39:54 -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:08.967 16:39:54 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:08.967 16:39:54 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:08.967 16:39:54 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:08.967 16:39:54 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:08.967 16:39:54 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:08.967 16:39:54 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:08.967 16:39:54 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:08.967 16:39:54 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:08.967 16:39:54 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:08.967 16:39:54 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:08.967 16:39:54 -- setup/hugepages.sh@78 -- # return 0 00:04:08.967 16:39:54 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:08.967 16:39:54 -- setup/hugepages.sh@187 -- # setup output 00:04:08.967 16:39:54 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.967 16:39:54 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:13.171 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:13.171 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:13.171 16:39:58 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:13.171 16:39:58 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:13.171 16:39:58 -- setup/hugepages.sh@89 -- # local node 00:04:13.171 16:39:58 -- setup/hugepages.sh@90 -- # local sorted_t 
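
The HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' string passed to setup output above is assembled from one "nodes_hp[n]=count" token per node, joined with commas via the `local IFS=,` seen earlier in the trace, while the total (512 + 1024 = 1536) accumulates into nr_hugepages. A sketch of that assembly (build_hugenode is an illustrative wrapper, not the SPDK code):

build_hugenode() {
    local IFS=,                        # matches the trace's "local IFS=,"
    local node _nr_hugepages=0
    local -a HUGENODE=()
    local -a nodes_hp=([0]=512 [1]=1024)
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
        (( _nr_hugepages += nodes_hp[node] ))
    done
    echo "HUGENODE=${HUGENODE[*]}"     # nodes_hp[0]=512,nodes_hp[1]=1024
    echo "nr_hugepages=$_nr_hugepages" # 1536
}
build_hugenode
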
00:04:13.171 16:39:58 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:13.171 16:39:58 -- setup/hugepages.sh@92 -- # local surp 00:04:13.171 16:39:58 -- setup/hugepages.sh@93 -- # local resv 00:04:13.171 16:39:58 -- setup/hugepages.sh@94 -- # local anon 00:04:13.171 16:39:58 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:13.171 16:39:58 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:13.171 16:39:58 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:13.171 16:39:58 -- setup/common.sh@18 -- # local node= 00:04:13.171 16:39:58 -- setup/common.sh@19 -- # local var val 00:04:13.171 16:39:58 -- setup/common.sh@20 -- # local mem_f mem 00:04:13.171 16:39:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:13.171 16:39:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:13.171 16:39:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:13.171 16:39:58 -- setup/common.sh@28 -- # mapfile -t mem 00:04:13.171 16:39:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 40605160 kB' 'MemAvailable: 44330624 kB' 'Buffers: 8940 kB' 'Cached: 12524240 kB' 'SwapCached: 0 kB' 'Active: 9395292 kB' 'Inactive: 3688316 kB' 'Active(anon): 8978796 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 553060 kB' 'Mapped: 148236 kB' 'Shmem: 8428368 kB' 'KReclaimable: 235036 kB' 'Slab: 902804 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667768 kB' 'KernelStack: 21744 kB' 'PageTables: 7328 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 10232500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214416 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB' 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 
-- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- 
setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.171 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.171 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ SecPageTables 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.172 16:39:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:13.172 16:39:58 -- setup/common.sh@33 -- # echo 0 00:04:13.172 16:39:58 -- setup/common.sh@33 -- # return 0 00:04:13.172 16:39:58 -- setup/hugepages.sh@97 -- # anon=0 00:04:13.172 16:39:58 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:13.172 16:39:58 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:13.172 16:39:58 -- setup/common.sh@18 -- # local node= 00:04:13.172 16:39:58 -- setup/common.sh@19 -- # 
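
The snapshot scanned above reports HugePages_Total: 1536, Hugepagesize: 2048 kB and Hugetlb: 3145728 kB, and the scan just returned AnonHugePages = 0, so transparent huge pages cannot be mistaken for the explicitly reserved pool. A sketch of the arithmetic tying those fields together, 1536 * 2048 kB = 3145728 kB (the field names are real /proc/meminfo keys; check_hugetlb_accounting is illustrative, not part of the test suite):

check_hugetlb_accounting() {
    local total size hugetlb
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    size=$(awk '/^Hugepagesize:/ {print $2}' /proc/meminfo)
    hugetlb=$(awk '/^Hugetlb:/ {print $2}' /proc/meminfo)
    (( total * size == hugetlb )) && echo "pool consistent: $hugetlb kB"
}
check_hugetlb_accounting
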
local var val
00:04:13.172 16:39:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.172 16:39:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.172 16:39:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.172 16:39:58 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.172 16:39:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.172 16:39:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.172 16:39:58 -- setup/common.sh@31 -- # IFS=': '
00:04:13.172 16:39:58 -- setup/common.sh@31 -- # read -r var val _
00:04:13.172 16:39:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 40605692 kB' 'MemAvailable: 44331156 kB' 'Buffers: 8940 kB' 'Cached: 12524240 kB' 'SwapCached: 0 kB' 'Active: 9395008 kB' 'Inactive: 3688316 kB' 'Active(anon): 8978512 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 553304 kB' 'Mapped: 148232 kB' 'Shmem: 8428368 kB' 'KReclaimable: 235036 kB' 'Slab: 902872 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667836 kB' 'KernelStack: 21744 kB' 'PageTables: 7400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 10232512 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214400 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[trace condensed: setup/common.sh@32 compares every /proc/meminfo key above against HugePages_Surp, hitting continue and re-running IFS=': ' / read -r var val _ for each non-matching key, until the HugePages_Surp line matches]
00:04:13.173 16:39:58 -- setup/common.sh@33 -- # echo 0
00:04:13.173 16:39:58 -- setup/common.sh@33 -- # return 0
00:04:13.173 16:39:58 -- setup/hugepages.sh@99 -- # surp=0
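For readability, the traced parsing logic above boils down to the following minimal sketch. It paraphrases setup/common.sh's get_meminfo as it appears in this trace (function and variable names are taken from the trace; the here-string loop and argument handling are assumptions, not the verbatim script):

    #!/usr/bin/env bash
    # Minimal sketch of the get_meminfo logic seen in the trace above.
    # Reads /proc/meminfo, or a per-node meminfo file when a node number is
    # given, strips the "Node N " prefix that per-node files carry, and
    # echoes the value of the requested key.
    shopt -s extglob

    get_meminfo() {
        local get=$1 node=${2:-}
        local var val _
        local mem_f=/proc/meminfo
        # Per-node statistics live under /sys/devices/system/node/nodeN/meminfo.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        # Per-node files prefix every line with "Node N "; drop that prefix
        # so both file formats parse identically.
        mem=("${mem[@]#Node +([0-9]) }")
        local line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            # Skip every key until the requested one matches, then echo its
            # value, exactly like the continue-until-match scan in the trace.
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done
        return 1
    }

With the values captured in this run, `get_meminfo HugePages_Surp` would print 0 (matching the surp=0 assignment above), and `get_meminfo HugePages_Surp 0` would read node0's file instead.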
00:04:13.173 16:39:58 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:13.173 16:39:58 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:13.173 16:39:58 -- setup/common.sh@18 -- # local node=
00:04:13.174 16:39:58 -- setup/common.sh@19 -- # local var val
00:04:13.174 16:39:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.174 16:39:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.174 16:39:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.174 16:39:58 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.174 16:39:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.174 16:39:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.174 16:39:58 -- setup/common.sh@31 -- # IFS=': '
00:04:13.174 16:39:58 -- setup/common.sh@31 -- # read -r var val _
00:04:13.174 16:39:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 40605692 kB' 'MemAvailable: 44331156 kB' 'Buffers: 8940 kB' 'Cached: 12524240 kB' 'SwapCached: 0 kB' 'Active: 9395008 kB' 'Inactive: 3688316 kB' 'Active(anon): 8978512 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 553304 kB' 'Mapped: 148232 kB' 'Shmem: 8428368 kB' 'KReclaimable: 235036 kB' 'Slab: 902872 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667836 kB' 'KernelStack: 21744 kB' 'PageTables: 7400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 10232528 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214400 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[trace condensed: same key-by-key scan as above, this time against HugePages_Rsvd, with continue on every non-matching key until HugePages_Rsvd matches]
00:04:13.175 16:39:58 -- setup/common.sh@33 -- # echo 0
00:04:13.175 16:39:58 -- setup/common.sh@33 -- # return 0
00:04:13.175 16:39:58 -- setup/hugepages.sh@100 -- # resv=0
00:04:13.175 16:39:58 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
nr_hugepages=1536
00:04:13.175 16:39:58 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:04:13.175 16:39:58 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:04:13.175 16:39:58 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:04:13.175 16:39:58 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:13.175 16:39:58 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
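The checks at setup/hugepages.sh@107 and @109 assert a simple accounting identity: the HugePages_Total that the kernel reports must equal the requested nr_hugepages plus any surplus and reserved pages. A hedged one-screen restatement, with hypothetical variable names mirroring the trace and the values from this run (1536 total, 0 surplus, 0 reserved):

    # Restatement of the hugepages.sh@107/@109 invariant, not the verbatim
    # script. surp, resv, and total come from the three get_meminfo reads
    # traced above (HugePages_Surp, HugePages_Rsvd, HugePages_Total).
    nr_hugepages=1536
    surp=0       # HugePages_Surp from /proc/meminfo
    resv=0       # HugePages_Rsvd from /proc/meminfo
    total=1536   # HugePages_Total from /proc/meminfo

    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2
    (( total == nr_hugepages )) && echo "no surplus or reserved pages in use"

Both arithmetic tests succeed in this run, which is why the trace proceeds straight to the per-node pass below.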
00:04:13.175 16:39:58 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:13.175 16:39:58 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:13.175 16:39:58 -- setup/common.sh@18 -- # local node=
00:04:13.175 16:39:58 -- setup/common.sh@19 -- # local var val
00:04:13.175 16:39:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.175 16:39:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.175 16:39:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:13.175 16:39:58 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:13.175 16:39:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.175 16:39:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.175 16:39:58 -- setup/common.sh@31 -- # IFS=': '
00:04:13.175 16:39:58 -- setup/common.sh@31 -- # read -r var val _
00:04:13.175 16:39:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 40604936 kB' 'MemAvailable: 44330400 kB' 'Buffers: 8940 kB' 'Cached: 12524240 kB' 'SwapCached: 0 kB' 'Active: 9395168 kB' 'Inactive: 3688316 kB' 'Active(anon): 8978672 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 553464 kB' 'Mapped: 148232 kB' 'Shmem: 8428368 kB' 'KReclaimable: 235036 kB' 'Slab: 902872 kB' 'SReclaimable: 235036 kB' 'SUnreclaim: 667836 kB' 'KernelStack: 21728 kB' 'PageTables: 7348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 10232540 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214400 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[trace condensed: key-by-key scan against HugePages_Total, with continue on every non-matching key, until HugePages_Total matches]
00:04:13.177 16:39:58 -- setup/common.sh@33 -- # echo 1536
00:04:13.177 16:39:58 -- setup/common.sh@33 -- # return 0
00:04:13.177 16:39:58 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:13.177 16:39:58 -- setup/hugepages.sh@112 -- # get_nodes
00:04:13.177 16:39:58 -- setup/hugepages.sh@27 -- # local node
00:04:13.177 16:39:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:13.177 16:39:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:13.177 16:39:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:13.177 16:39:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:13.177 16:39:58 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:13.177 16:39:58 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:13.177 16:39:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:13.177 16:39:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:13.177 16:39:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:13.177 16:39:58 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.177 16:39:58 -- setup/common.sh@18 -- # local node=0
00:04:13.177 16:39:58 -- setup/common.sh@19 -- # local var val
00:04:13.177 16:39:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.177 16:39:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.177 16:39:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:13.177 16:39:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:13.177 16:39:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.177 16:39:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.177 16:39:58 -- setup/common.sh@31 -- # IFS=': '
00:04:13.177 16:39:58 -- setup/common.sh@31 -- # read -r var val _
00:04:13.177 16:39:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 20435848 kB' 'MemUsed: 12149520 kB' 'SwapCached: 0 kB' 'Active: 6482736 kB' 'Inactive: 3554120 kB' 'Active(anon): 6189676 kB' 'Inactive(anon): 0 kB' 'Active(file): 293060 kB' 'Inactive(file): 3554120 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9625636 kB' 'Mapped: 127048 kB' 'AnonPages: 414316 kB' 'Shmem: 5778456 kB' 'KernelStack: 12920 kB' 'PageTables: 4680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149936 kB' 'Slab: 459984 kB' 'SReclaimable: 149936 kB' 'SUnreclaim: 310048 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[trace condensed: key-by-key scan of node0's meminfo against HugePages_Surp until it matches]
00:04:13.178 16:39:58 -- setup/common.sh@33 -- # echo 0
00:04:13.178 16:39:58 -- setup/common.sh@33 -- # return 0
00:04:13.178 16:39:58 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:13.178 16:39:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:13.178 16:39:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:13.178 16:39:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:13.178 16:39:58 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:13.178 16:39:58 -- setup/common.sh@18 -- # local node=1
00:04:13.178 16:39:58 -- setup/common.sh@19 -- # local var val
00:04:13.178 16:39:58 -- setup/common.sh@20 -- # local mem_f mem
00:04:13.178 16:39:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:13.178 16:39:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:13.178 16:39:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:13.178 16:39:58 -- setup/common.sh@28 -- # mapfile -t mem
00:04:13.178 16:39:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:13.178 16:39:58 -- setup/common.sh@31 -- # IFS=': '
00:04:13.178 16:39:58 -- setup/common.sh@31 -- # read -r var val _
00:04:13.178 16:39:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698400 kB' 'MemFree: 20168660 kB' 'MemUsed: 7529740 kB' 'SwapCached: 0 kB' 'Active: 2912344 kB' 'Inactive: 134196 kB' 'Active(anon): 2788908 kB' 'Inactive(anon): 0 kB' 'Active(file): 123436 kB' 'Inactive(file): 134196 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2907600 kB' 'Mapped: 21184 kB' 'AnonPages: 139004 kB' 'Shmem: 2649968 kB' 'KernelStack: 8824 kB' 'PageTables: 2720 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 85100 kB' 'Slab: 442888 kB' 'SReclaimable: 85100 kB' 'SUnreclaim: 357788 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[trace condensed: key-by-key scan of node1's meminfo against HugePages_Surp; the captured log ends mid-scan at the HugePages_Free comparison]
16:39:58 -- setup/common.sh@32 -- # continue 00:04:13.179 16:39:58 -- setup/common.sh@31 -- # IFS=': ' 00:04:13.179 16:39:58 -- setup/common.sh@31 -- # read -r var val _ 00:04:13.179 16:39:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:13.179 16:39:58 -- setup/common.sh@33 -- # echo 0 00:04:13.179 16:39:58 -- setup/common.sh@33 -- # return 0 00:04:13.179 16:39:58 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:13.179 16:39:58 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.179 16:39:58 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.179 16:39:58 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.179 16:39:58 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:13.179 node0=512 expecting 512 00:04:13.179 16:39:58 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:13.179 16:39:58 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:13.179 16:39:58 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:13.179 16:39:58 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:04:13.179 node1=1024 expecting 1024 00:04:13.179 16:39:58 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:04:13.179 00:04:13.179 real 0m3.735s 00:04:13.179 user 0m1.420s 00:04:13.179 sys 0m2.386s 00:04:13.179 16:39:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:13.179 16:39:58 -- common/autotest_common.sh@10 -- # set +x 00:04:13.179 ************************************ 00:04:13.179 END TEST custom_alloc 00:04:13.179 ************************************ 00:04:13.179 16:39:58 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:04:13.179 16:39:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:13.179 16:39:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:13.179 16:39:58 -- common/autotest_common.sh@10 -- # set +x 00:04:13.179 ************************************ 00:04:13.179 START TEST no_shrink_alloc 00:04:13.179 ************************************ 00:04:13.179 16:39:58 -- common/autotest_common.sh@1114 -- # no_shrink_alloc 00:04:13.179 16:39:58 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:04:13.179 16:39:58 -- setup/hugepages.sh@49 -- # local size=2097152 00:04:13.179 16:39:58 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:13.179 16:39:58 -- setup/hugepages.sh@51 -- # shift 00:04:13.179 16:39:58 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:13.179 16:39:58 -- setup/hugepages.sh@52 -- # local node_ids 00:04:13.179 16:39:58 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:13.179 16:39:58 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:13.179 16:39:58 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:13.179 16:39:58 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:13.179 16:39:58 -- setup/hugepages.sh@62 -- # local user_nodes 00:04:13.179 16:39:58 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:13.179 16:39:58 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:13.179 16:39:58 -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:13.179 16:39:58 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:13.179 16:39:58 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:13.179 16:39:58 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:13.179 16:39:58 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:13.179 16:39:58 -- setup/hugepages.sh@73 -- # 
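The no_shrink_alloc sizing just traced is plain integer arithmetic. A minimal sketch of it, reconstructed from the trace alone (names and structure here are assumptions, not SPDK's verbatim source):

  #!/usr/bin/env bash
  # Reproduce the get_test_nr_hugepages math seen above: a 2 GiB pool of
  # 2048 kB hugepages, pinned entirely to the one node the caller named.
  default_hugepages=2048        # kB, Hugepagesize from /proc/meminfo
  size=2097152                  # kB requested by the test (2 GiB)
  user_nodes=(0)                # node ids passed after the size argument

  (( size >= default_hugepages )) || exit 1
  nr_hugepages=$(( size / default_hugepages ))   # 2097152 / 2048 = 1024

  declare -A nodes_test
  for node in "${user_nodes[@]}"; do
    nodes_test[$node]=$nr_hugepages              # all 1024 pages on node 0
  done
  echo "nr_hugepages=$nr_hugepages node0=${nodes_test[0]}"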
00:04:13.179 16:39:58 -- setup/hugepages.sh@198 -- # setup output
00:04:13.179 16:39:58 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:13.179 16:39:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:16.475 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:16.475 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:16.476 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:16.476 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:16.476 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:16.476 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:16.476 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:16.476 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:16.476 16:40:01 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:16.476 16:40:01 -- setup/hugepages.sh@89 -- # local node
00:04:16.476 16:40:01 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:16.476 16:40:01 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:16.476 16:40:01 -- setup/hugepages.sh@92 -- # local surp
00:04:16.476 16:40:01 -- setup/hugepages.sh@93 -- # local resv
00:04:16.476 16:40:01 -- setup/hugepages.sh@94 -- # local anon
00:04:16.476 16:40:01 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:16.476 16:40:01 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:16.476 16:40:01 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:16.476 16:40:01 -- setup/common.sh@18 -- # local node=
00:04:16.476 16:40:01 -- setup/common.sh@19 -- # local var val
00:04:16.476 16:40:01 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.476 16:40:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.476 16:40:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.476 16:40:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.476 16:40:01 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.476 16:40:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.476 16:40:01 -- setup/common.sh@31 -- # IFS=': '
00:04:16.476 16:40:01 -- setup/common.sh@31 -- # read -r var val _
00:04:16.476 16:40:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41621992 kB' 'MemAvailable: 45347440 kB' 'Buffers: 8940 kB' 'Cached: 12524376 kB' 'SwapCached: 0 kB' 'Active: 9397676 kB' 'Inactive: 3688316 kB' 'Active(anon): 8981180 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555688 kB' 'Mapped: 148336 kB' 'Shmem: 8428504 kB' 'KReclaimable: 235004 kB' 'Slab: 903108 kB' 'SReclaimable: 235004 kB' 'SUnreclaim: 668104 kB' 'KernelStack: 21872 kB' 'PageTables: 7356 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10237468 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214432 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[xtrace loop condensed: setup/common.sh@31-32 repeated IFS=': '; read -r var val _; continue for every key from MemTotal through HardwareCorrupted, none matching AnonHugePages]
00:04:16.477 16:40:02 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:16.477 16:40:02 -- setup/common.sh@33 -- # echo 0
00:04:16.477 16:40:02 -- setup/common.sh@33 -- # return 0
00:04:16.477 16:40:02 -- setup/hugepages.sh@97 -- # anon=0
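The AnonHugePages lookup above is one full pass of the field scan that recurs throughout this log. A minimal reconstruction of what setup/common.sh's get_meminfo appears to do, built only from the trace (treat the body as a sketch, not SPDK's verbatim function):

  #!/usr/bin/env bash
  shopt -s extglob   # for the +([0-9]) pattern below
  # Sketch of get_meminfo as inferred from the xtrace; details are assumed.
  get_meminfo() {
    local get=$1 node=${2:-}
    local var val _
    local mem_f=/proc/meminfo
    # with a node argument, read that node's counters from sysfs instead
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
      mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefix on node files
    local line
    for line in "${mem[@]}"; do
      IFS=': ' read -r var val _ <<< "$line"   # var=key, val=number, _=unit
      if [[ $var == "$get" ]]; then
        echo "$val"
        return 0
      fi
    done
    return 1
  }
  get_meminfo HugePages_Total     # prints 1024 on the rig in this log
  get_meminfo HugePages_Free 0    # same counter, restricted to node 0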
00:04:16.477 16:40:02 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:16.477 16:40:02 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:16.477 16:40:02 -- setup/common.sh@18 -- # local node=
00:04:16.477 16:40:02 -- setup/common.sh@19 -- # local var val
00:04:16.477 16:40:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.477 16:40:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.477 16:40:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.477 16:40:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.477 16:40:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.477 16:40:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.477 16:40:02 -- setup/common.sh@31 -- # IFS=': '
00:04:16.477 16:40:02 -- setup/common.sh@31 -- # read -r var val _
00:04:16.477 16:40:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41624748 kB' 'MemAvailable: 45350196 kB' 'Buffers: 8940 kB' 'Cached: 12524384 kB' 'SwapCached: 0 kB' 'Active: 9399144 kB' 'Inactive: 3688316 kB' 'Active(anon): 8982648 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 557440 kB' 'Mapped: 148748 kB' 'Shmem: 8428512 kB' 'KReclaimable: 235004 kB' 'Slab: 903196 kB' 'SReclaimable: 235004 kB' 'SUnreclaim: 668192 kB' 'KernelStack: 21824 kB' 'PageTables: 7644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10240556 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214416 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[xtrace loop condensed: setup/common.sh@31-32 repeated IFS=': '; read -r var val _; continue for every key from MemTotal through HugePages_Rsvd, none matching HugePages_Surp]
00:04:16.479 16:40:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.479 16:40:02 -- setup/common.sh@33 -- # echo 0
00:04:16.479 16:40:02 -- setup/common.sh@33 -- # return 0
00:04:16.479 16:40:02 -- setup/hugepages.sh@99 -- # surp=0
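As an aside for anyone reproducing this by hand: HugePages_Surp and HugePages_Rsvd, which the trace reads out of /proc/meminfo, are also exposed per page size (and per node) under sysfs. The paths below are standard kernel hugetlb interfaces, not taken from this log:

  #!/usr/bin/env bash
  # Cross-check the same counters the test parses above (standard sysfs paths).
  base=/sys/kernel/mm/hugepages/hugepages-2048kB
  for f in nr_hugepages free_hugepages resv_hugepages surplus_hugepages; do
    printf '%-20s %s\n' "$f" "$(cat "$base/$f")"
  done
  # Per-node view of the same 2048 kB pool (node0 shown):
  cat /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages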
00:04:16.479 16:40:02 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:16.479 16:40:02 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:16.479 16:40:02 -- setup/common.sh@18 -- # local node=
00:04:16.479 16:40:02 -- setup/common.sh@19 -- # local var val
00:04:16.479 16:40:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.479 16:40:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.479 16:40:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.479 16:40:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.479 16:40:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.479 16:40:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.479 16:40:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41620248 kB' 'MemAvailable: 45345696 kB' 'Buffers: 8940 kB' 'Cached: 12524400 kB' 'SwapCached: 0 kB' 'Active: 9402148 kB' 'Inactive: 3688316 kB' 'Active(anon): 8985652 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 560400 kB' 'Mapped: 148748 kB' 'Shmem: 8428528 kB' 'KReclaimable: 235004 kB' 'Slab: 903196 kB' 'SReclaimable: 235004 kB' 'SUnreclaim: 668192 kB' 'KernelStack: 21856 kB' 'PageTables: 7512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10242240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214516 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
00:04:16.479 16:40:02 -- setup/common.sh@31 -- # IFS=': '
00:04:16.479 16:40:02 -- setup/common.sh@31 -- # read -r var val _
[xtrace loop condensed: setup/common.sh@31-32 repeated IFS=': '; read -r var val _; continue for every key from MemTotal through HugePages_Free, none matching HugePages_Rsvd]
00:04:16.480 16:40:02 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:16.480 16:40:02 -- setup/common.sh@33 -- # echo 0
00:04:16.480 16:40:02 -- setup/common.sh@33 -- # return 0
00:04:16.480 16:40:02 -- setup/hugepages.sh@100 -- # resv=0
00:04:16.480 16:40:02 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:16.480 nr_hugepages=1024
00:04:16.480 16:40:02 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:16.480 resv_hugepages=0
00:04:16.480 16:40:02 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:16.480 surplus_hugepages=0
00:04:16.480 16:40:02 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:16.480 anon_hugepages=0
00:04:16.480 16:40:02 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:16.481 16:40:02 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
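The two arithmetic guards at hugepages.sh@107 and @109 are the core of verify_nr_hugepages: the kernel-reported pool must equal the requested pages plus surplus plus reserved, and (with no surplus or reservations) must equal the request exactly. A standalone restatement, with this run's values hard-coded for illustration:

  #!/usr/bin/env bash
  nr_hugepages=1024      # pool size the test configured
  surp=0                 # HugePages_Surp read back above
  resv=0                 # HugePages_Rsvd read back above
  hugepages_total=1024   # HugePages_Total about to be read below

  (( hugepages_total == nr_hugepages + surp + resv )) \
    || { echo 'hugepage accounting mismatch' >&2; exit 1; }
  (( hugepages_total == nr_hugepages )) \
    || { echo 'pool size differs from request' >&2; exit 1; }
  echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp"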
00:04:16.481 16:40:02 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:16.481 16:40:02 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:16.481 16:40:02 -- setup/common.sh@18 -- # local node=
00:04:16.481 16:40:02 -- setup/common.sh@19 -- # local var val
00:04:16.481 16:40:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.481 16:40:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.481 16:40:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:16.481 16:40:02 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:16.481 16:40:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.481 16:40:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.481 16:40:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41620324 kB' 'MemAvailable: 45345772 kB' 'Buffers: 8940 kB' 'Cached: 12524416 kB' 'SwapCached: 0 kB' 'Active: 9399680 kB' 'Inactive: 3688316 kB' 'Active(anon): 8983184 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 558012 kB' 'Mapped: 148748 kB' 'Shmem: 8428544 kB' 'KReclaimable: 235004 kB' 'Slab: 903196 kB' 'SReclaimable: 235004 kB' 'SUnreclaim: 668192 kB' 'KernelStack: 21920 kB' 'PageTables: 7708 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10241092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214464 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[xtrace elided: setup/common.sh@31-32 loops IFS=': ' / read -r var val _ / continue over every key in the snapshot above until the match below]
00:04:16.482 16:40:02 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:16.482 16:40:02 -- setup/common.sh@33 -- # echo 1024
00:04:16.482 16:40:02 -- setup/common.sh@33 -- # return 0
00:04:16.482 16:40:02 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:16.482 16:40:02 -- setup/hugepages.sh@112 -- # get_nodes
00:04:16.482 16:40:02 -- setup/hugepages.sh@27 -- # local node
00:04:16.482 16:40:02 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.482 16:40:02 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:16.482 16:40:02 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:16.482 16:40:02 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:16.482 16:40:02 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:16.482 16:40:02 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
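get_nodes lands on no_nodes=2 by globbing the nodeN directories. The same enumeration, plus the per-node HugePages_Total lookup the run performs next, can be sketched as below; the nodes_test table and the awk lookup are illustrative stand-ins, not lifted from hugepages.sh:

    shopt -s extglob nullglob
    declare -A nodes_test
    for node in /sys/devices/system/node/node+([0-9]); do
        # ${node##*node} peels the path down to the numeric node id
        nodes_test[${node##*node}]=0
    done
    echo "no_nodes=${#nodes_test[@]}"

    for id in "${!nodes_test[@]}"; do
        # Per-node counters live in the node's own meminfo; the value is the last field.
        nodes_test[$id]=$(awk '/HugePages_Total:/ {print $NF}' \
            "/sys/devices/system/node/node$id/meminfo")
        echo "node$id=${nodes_test[$id]}"
    done

On this box that would print no_nodes=2 and node0=1024, matching the accounting the trace builds up.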
00:04:16.482 16:40:02 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:16.482 16:40:02 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:16.482 16:40:02 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:16.482 16:40:02 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:16.482 16:40:02 -- setup/common.sh@18 -- # local node=0
00:04:16.482 16:40:02 -- setup/common.sh@19 -- # local var val
00:04:16.482 16:40:02 -- setup/common.sh@20 -- # local mem_f mem
00:04:16.482 16:40:02 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:16.482 16:40:02 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:16.482 16:40:02 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:16.482 16:40:02 -- setup/common.sh@28 -- # mapfile -t mem
00:04:16.482 16:40:02 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:16.482 16:40:02 -- setup/common.sh@31 -- # IFS=': '
00:04:16.482 16:40:02 -- setup/common.sh@31 -- # read -r var val _
00:04:16.483 16:40:02 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19387136 kB' 'MemUsed: 13198232 kB' 'SwapCached: 0 kB' 'Active: 6483380 kB' 'Inactive: 3554120 kB' 'Active(anon): 6190320 kB' 'Inactive(anon): 0 kB' 'Active(file): 293060 kB' 'Inactive(file): 3554120 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9625736 kB' 'Mapped: 127060 kB' 'AnonPages: 415052 kB' 'Shmem: 5778556 kB' 'KernelStack: 12904 kB' 'PageTables: 4676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149920 kB' 'Slab: 460064 kB' 'SReclaimable: 149920 kB' 'SUnreclaim: 310144 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace elided: setup/common.sh@31-32 loops IFS=': ' / read -r var val _ / continue over every key in the node0 snapshot above until the match below]
00:04:16.483 16:40:02 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:16.483 16:40:02 -- setup/common.sh@33 -- # echo 0
00:04:16.484 16:40:02 -- setup/common.sh@33 -- # return 0
00:04:16.484 16:40:02 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:16.484 16:40:02 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:16.484 16:40:02 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:16.484 16:40:02 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:16.484 16:40:02 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:16.484 node0=1024 expecting 1024
00:04:16.484 16:40:02 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:16.484 16:40:02 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:16.484 16:40:02 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:16.484 16:40:02 -- setup/hugepages.sh@202 -- # setup output
00:04:16.484 16:40:02 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:16.484 16:40:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:20.686 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:20.686 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:20.686 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:20.686 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:20.686 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:20.686 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:20.686 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:20.686 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:20.687 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:20.687 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:20.687 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:20.687 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:20.687 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:20.687 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:20.687 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:20.687 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:20.687 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:20.687 INFO: Requested 512 hugepages but 1024 already allocated on node0
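The INFO line is setup.sh declining to touch an allocation that already covers the request: with CLEAR_HUGE=no the existing 1024 pages stand even though only NRHUGE=512 were asked for, so the request behaves as a floor, not a target. A hedged sketch of that check (standard procfs/sysfs paths; a simplification of what setup.sh actually does, not its source):

    NRHUGE=512
    nr_now=$(cat /proc/sys/vm/nr_hugepages)
    if (( nr_now >= NRHUGE )); then
        # The pool is already large enough, so leave it alone.
        echo "INFO: Requested $NRHUGE hugepages but $nr_now already allocated"
    else
        # Growing the pool needs root; per-node sizing would instead go
        # through /sys/devices/system/node/node*/hugepages/.
        echo "$NRHUGE" > /proc/sys/vm/nr_hugepages
    fi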
00:04:20.687 16:40:05 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:20.687 16:40:05 -- setup/hugepages.sh@89 -- # local node
00:04:20.687 16:40:05 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:20.687 16:40:05 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:20.687 16:40:05 -- setup/hugepages.sh@92 -- # local surp
00:04:20.687 16:40:05 -- setup/hugepages.sh@93 -- # local resv
00:04:20.687 16:40:05 -- setup/hugepages.sh@94 -- # local anon
00:04:20.687 16:40:05 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:20.687 16:40:05 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:20.687 16:40:05 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:20.687 16:40:05 -- setup/common.sh@18 -- # local node=
00:04:20.687 16:40:05 -- setup/common.sh@19 -- # local var val
00:04:20.687 16:40:05 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.687 16:40:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.687 16:40:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.687 16:40:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.687 16:40:05 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.687 16:40:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.687 16:40:05 -- setup/common.sh@31 -- # IFS=': '
00:04:20.687 16:40:05 -- setup/common.sh@31 -- # read -r var val _
00:04:20.687 16:40:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41618376 kB' 'MemAvailable: 45343824 kB' 'Buffers: 8940 kB' 'Cached: 12524508 kB' 'SwapCached: 0 kB' 'Active: 9397928 kB' 'Inactive: 3688316 kB' 'Active(anon): 8981432 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555536 kB' 'Mapped: 148348 kB' 'Shmem: 8428636 kB' 'KReclaimable: 235004 kB' 'Slab: 903412 kB' 'SReclaimable: 235004 kB' 'SUnreclaim: 668408 kB' 'KernelStack: 21776 kB' 'PageTables: 7468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10234080 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[xtrace elided: setup/common.sh@31-32 loops IFS=': ' / read -r var val _ / continue over every key in the snapshot above until the match below]
00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:20.688 16:40:05 -- setup/common.sh@33 -- # echo 0
00:04:20.688 16:40:05 -- setup/common.sh@33 -- # return 0
00:04:20.688 16:40:05 -- setup/hugepages.sh@97 -- # anon=0
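verify_nr_hugepages only samples AnonHugePages because transparent hugepages are not disabled here; the guard at hugepages.sh@96 compares the THP mode string ("always [madvise] never" on this host) against *[never]*. Roughly, using the standard kernel sysfs path and simplified from the trace:

    thp_mode=$(</sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
    if [[ $thp_mode != *"[never]"* ]]; then
        # THP is active in some form, so anonymous huge page usage matters.
        anon=$(awk '/AnonHugePages:/ {print $2}' /proc/meminfo)
        echo "anon_hugepages=${anon}"   # the run above resolves this to 0
    fi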
'MemAvailable: 45344272 kB' 'Buffers: 8940 kB' 'Cached: 12524512 kB' 'SwapCached: 0 kB' 'Active: 9397640 kB' 'Inactive: 3688316 kB' 'Active(anon): 8981144 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555760 kB' 'Mapped: 148252 kB' 'Shmem: 8428640 kB' 'KReclaimable: 235004 kB' 'Slab: 903372 kB' 'SReclaimable: 235004 kB' 'SUnreclaim: 668368 kB' 'KernelStack: 21760 kB' 'PageTables: 7404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10234092 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB' 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.688 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.688 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.689 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 
00:04:20.689 16:40:05 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
[... setup/common.sh@31-@32 walk every remaining /proc/meminfo field (Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd); none matches HugePages_Surp, so each iteration takes the `continue` branch ...]
00:04:20.689 16:40:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:20.689 16:40:05 -- setup/common.sh@33 -- # echo 0
00:04:20.689 16:40:05 -- setup/common.sh@33 -- # return 0
00:04:20.689 16:40:05 -- setup/hugepages.sh@99 -- # surp=0
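The trace above is the get_meminfo pattern from setup/common.sh: slurp the meminfo file, then split each line on ':' plus whitespace and compare the key. A minimal standalone sketch of the same idea follows; it is a rewrite for illustration, not the SPDK function itself.

  #!/usr/bin/env bash
  # Minimal get_meminfo-style lookup, modeled on the xtrace above.
  get_meminfo() {
      local get=$1 node=${2:-} mem_f=/proc/meminfo var val _
      # Per-node stats live in sysfs; fall back to the global file otherwise.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      # Node files prefix every line with "Node <N> "; strip it first, the
      # way the mem=("${mem[@]#Node +([0-9]) }") step does in the trace.
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
      return 1
  }

  echo "HugePages_Total=$(get_meminfo HugePages_Total)"      # e.g. 1024
  echo "node0 HugePages_Free=$(get_meminfo HugePages_Free 0)"

The linear scan is O(number of meminfo fields), which is why the xtrace shows one comparison-plus-continue pair per field before the match.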
00:04:20.689 16:40:05 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:20.689 16:40:05 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:20.689 16:40:05 -- setup/common.sh@18 -- # local node=
00:04:20.689 16:40:05 -- setup/common.sh@19 -- # local var val
00:04:20.689 16:40:05 -- setup/common.sh@20 -- # local mem_f mem
00:04:20.689 16:40:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:20.689 16:40:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:20.689 16:40:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:20.689 16:40:05 -- setup/common.sh@28 -- # mapfile -t mem
00:04:20.689 16:40:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:20.689 16:40:05 -- setup/common.sh@31 -- # IFS=': '
00:04:20.689 16:40:05 -- setup/common.sh@31 -- # read -r var val _
00:04:20.689 16:40:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41619000 kB' 'MemAvailable: 45344448 kB' 'Buffers: 8940 kB' 'Cached: 12524516 kB' 'SwapCached: 0 kB' 'Active: 9397340 kB' 'Inactive: 3688316 kB' 'Active(anon): 8980844 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555440 kB' 'Mapped: 148252 kB' 'Shmem: 8428644 kB' 'KReclaimable: 235004 kB' 'Slab: 903372 kB' 'SReclaimable: 235004 kB' 'SUnreclaim: 668368 kB' 'KernelStack: 21760 kB' 'PageTables: 7404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10234108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[... setup/common.sh@31-@32 compare every field from MemTotal through HugePages_Free against HugePages_Rsvd; each one takes the `continue` branch ...]
00:04:20.690 16:40:05 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:20.691 16:40:05 -- setup/common.sh@33 -- # echo 0
00:04:20.691 16:40:05 -- setup/common.sh@33 -- # return 0
00:04:20.691 16:40:05 -- setup/hugepages.sh@100 -- # resv=0
00:04:20.691 16:40:05 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:20.691 nr_hugepages=1024
00:04:20.691 16:40:05 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:20.691 resv_hugepages=0
00:04:20.691 16:40:05 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:20.691 surplus_hugepages=0
00:04:20.691 16:40:05 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:20.691 anon_hugepages=0
00:04:20.691 16:40:05 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:20.691 16:40:05 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:20.691 16:40:05 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
[... same setup/common.sh@17-@31 prologue as above, this time with get=HugePages_Total and mem_f=/proc/meminfo ...]
00:04:20.691 16:40:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 41618244 kB' 'MemAvailable: 45343692 kB' 'Buffers: 8940 kB' 'Cached: 12524548 kB' 'SwapCached: 0 kB' 'Active: 9397320 kB' 'Inactive: 3688316 kB' 'Active(anon): 8980824 kB' 'Inactive(anon): 0 kB' 'Active(file): 416496 kB' 'Inactive(file): 3688316 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 555372 kB' 'Mapped: 148252 kB' 'Shmem: 8428676 kB' 'KReclaimable: 235004 kB' 'Slab: 903372 kB' 'SReclaimable: 235004 kB' 'SUnreclaim: 668368 kB' 'KernelStack: 21744 kB' 'PageTables: 7348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 10234120 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74368 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 12740608 kB' 'DirectMap1G: 56623104 kB'
[... every field from MemTotal through Unaccepted fails the HugePages_Total match and takes the `continue` branch ...]
00:04:20.692 16:40:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:20.692 16:40:05 -- setup/common.sh@33 -- # echo 1024
00:04:20.692 16:40:05 -- setup/common.sh@33 -- # return 0
00:04:20.692 16:40:05 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:20.692 16:40:05 -- setup/hugepages.sh@112 -- # get_nodes
00:04:20.692 16:40:05 -- setup/hugepages.sh@27 -- # local node
00:04:20.692 16:40:05 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:20.692 16:40:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:20.692 16:40:05 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:20.692 16:40:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:20.692 16:40:05 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:20.692 16:40:05 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
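The @107/@110 checks assert that the pool the test configured (nr_hugepages=1024) is fully accounted for by what the kernel reports. A rough standalone re-derivation of that bookkeeping, with the invariant copied from the trace's own check rather than stated as general kernel semantics; paths are the standard procfs/sysctl locations:

  #!/usr/bin/env bash
  # Re-derive the hugepage pool accounting the trace verifies above.
  meminfo() { awk -v k="$1:" '$1 == k { print $2 }' /proc/meminfo; }

  total=$(meminfo HugePages_Total)    # 1024 in the run above
  surp=$(meminfo HugePages_Surp)      # 0: nothing allocated beyond the pool
  resv=$(meminfo HugePages_Rsvd)      # 0: nothing reserved but unfaulted
  nr=$(cat /proc/sys/vm/nr_hugepages) # the configured persistent pool

  echo "nr_hugepages=$nr surplus=$surp reserved=$resv total=$total"
  # Same test as hugepages.sh@107; with surp and resv both 0, it reduces
  # to total == nr_hugepages, which is the @109 check.
  if (( total == nr + surp + resv )); then
      echo "hugepage pool accounting is consistent"
  fi

get_nodes then splits the expectation across NUMA nodes: this box has two nodes, with all 1024 pages expected on node0 and none on node1.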
"${!nodes_test[@]}" 00:04:20.692 16:40:05 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:20.692 16:40:05 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:20.692 16:40:05 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:20.692 16:40:05 -- setup/common.sh@18 -- # local node=0 00:04:20.692 16:40:05 -- setup/common.sh@19 -- # local var val 00:04:20.692 16:40:05 -- setup/common.sh@20 -- # local mem_f mem 00:04:20.692 16:40:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:20.692 16:40:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:20.692 16:40:05 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:20.692 16:40:05 -- setup/common.sh@28 -- # mapfile -t mem 00:04:20.692 16:40:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:20.692 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.692 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 19391436 kB' 'MemUsed: 13193932 kB' 'SwapCached: 0 kB' 'Active: 6484776 kB' 'Inactive: 3554120 kB' 'Active(anon): 6191716 kB' 'Inactive(anon): 0 kB' 'Active(file): 293060 kB' 'Inactive(file): 3554120 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9625852 kB' 'Mapped: 127572 kB' 'AnonPages: 416296 kB' 'Shmem: 5778672 kB' 'KernelStack: 12984 kB' 'PageTables: 4900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149920 kB' 'Slab: 460164 kB' 'SReclaimable: 149920 kB' 'SUnreclaim: 310244 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- 
setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 
00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.693 16:40:05 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:20.693 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.693 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.694 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.694 16:40:05 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.694 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.694 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.694 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.694 16:40:05 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.694 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.694 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.694 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.694 16:40:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.694 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.694 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.694 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.694 16:40:05 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.694 16:40:05 -- setup/common.sh@32 -- # continue 00:04:20.694 16:40:05 -- setup/common.sh@31 -- # IFS=': ' 00:04:20.694 16:40:05 -- setup/common.sh@31 -- # read -r var val _ 00:04:20.694 16:40:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:20.694 16:40:05 -- setup/common.sh@33 -- # echo 0 00:04:20.694 16:40:05 -- setup/common.sh@33 -- # return 0 00:04:20.694 16:40:05 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:20.694 16:40:05 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:20.694 16:40:05 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:20.694 16:40:05 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:20.694 16:40:05 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:20.694 node0=1024 expecting 1024 00:04:20.694 16:40:05 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:20.694 00:04:20.694 real 0m7.463s 00:04:20.694 user 0m2.825s 00:04:20.694 sys 0m4.775s 00:04:20.694 16:40:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:20.694 16:40:05 -- common/autotest_common.sh@10 -- # set +x 00:04:20.694 ************************************ 00:04:20.694 END TEST no_shrink_alloc 00:04:20.694 ************************************ 00:04:20.694 16:40:05 -- setup/hugepages.sh@217 -- # clear_hp 00:04:20.694 16:40:05 -- setup/hugepages.sh@37 -- # local node hp 00:04:20.694 16:40:05 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:20.694 16:40:05 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:20.694 16:40:05 -- setup/hugepages.sh@41 -- # echo 0 00:04:20.694 16:40:05 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:20.694 16:40:05 -- setup/hugepages.sh@41 -- # echo 0 00:04:20.694 16:40:05 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:20.694 16:40:05 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:20.694 16:40:05 -- setup/hugepages.sh@41 -- # echo 0 00:04:20.694 16:40:05 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:20.694 16:40:05 -- setup/hugepages.sh@41 -- # echo 0 00:04:20.694 16:40:05 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:20.694 16:40:05 -- 
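clear_hp tears the pools back down by zeroing every per-node hugepage count (two nodes, two hugepage sizes each in the loop above). A minimal sketch of that loop, assuming the usual sysfs layout; xtrace does not print redirections, so the target file of the `echo 0` is inferred here, and the whole thing needs root:

  #!/usr/bin/env bash
  # Release every per-node hugepage pool, as clear_hp does in the trace.
  shopt -s nullglob
  for node in /sys/devices/system/node/node[0-9]*; do
      for hp in "$node"/hugepages/hugepages-*; do
          # e.g. hugepages-2048kB and hugepages-1048576kB directories
          echo 0 > "$hp/nr_hugepages"
      done
  done
  export CLEAR_HUGE=yes   # flag the harness exports once the pools are cleared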
00:04:20.694 16:40:05 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:20.694
00:04:20.694 real    0m28.427s
00:04:20.694 user    0m10.362s
00:04:20.694 sys     0m17.135s
00:04:20.694 16:40:05 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:20.694 16:40:05 -- common/autotest_common.sh@10 -- # set +x
00:04:20.694 ************************************
00:04:20.694 END TEST hugepages
00:04:20.694 ************************************
00:04:20.694 16:40:06 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:20.694 16:40:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:20.694 16:40:06 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:20.694 16:40:06 -- common/autotest_common.sh@10 -- # set +x
00:04:20.694 ************************************
00:04:20.694 START TEST driver
00:04:20.694 ************************************
00:04:20.694 16:40:06 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:20.694 * Looking for test storage...
00:04:20.694 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:20.694 16:40:06 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:04:20.694 16:40:06 -- common/autotest_common.sh@1690 -- # lcov --version
00:04:20.694 16:40:06 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:04:20.694 16:40:06 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:04:20.694 16:40:06 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:04:20.694 16:40:06 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:04:20.694 16:40:06 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:04:20.694 16:40:06 -- scripts/common.sh@335 -- # IFS=.-:
00:04:20.694 16:40:06 -- scripts/common.sh@335 -- # read -ra ver1
00:04:20.694 16:40:06 -- scripts/common.sh@336 -- # IFS=.-:
00:04:20.694 16:40:06 -- scripts/common.sh@336 -- # read -ra ver2
00:04:20.694 16:40:06 -- scripts/common.sh@337 -- # local 'op=<'
00:04:20.694 16:40:06 -- scripts/common.sh@339 -- # ver1_l=2
00:04:20.694 16:40:06 -- scripts/common.sh@340 -- # ver2_l=1
00:04:20.694 16:40:06 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:04:20.694 16:40:06 -- scripts/common.sh@343 -- # case "$op" in
00:04:20.694 16:40:06 -- scripts/common.sh@344 -- # : 1
00:04:20.694 16:40:06 -- scripts/common.sh@363 -- # (( v = 0 ))
00:04:20.694 16:40:06 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:04:20.694 16:40:06 -- scripts/common.sh@364 -- # decimal 1
00:04:20.694 16:40:06 -- scripts/common.sh@352 -- # local d=1
00:04:20.694 16:40:06 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:04:20.694 16:40:06 -- scripts/common.sh@354 -- # echo 1
00:04:20.694 16:40:06 -- scripts/common.sh@364 -- # ver1[v]=1
00:04:20.694 16:40:06 -- scripts/common.sh@365 -- # decimal 2
00:04:20.694 16:40:06 -- scripts/common.sh@352 -- # local d=2
00:04:20.694 16:40:06 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:04:20.694 16:40:06 -- scripts/common.sh@354 -- # echo 2
00:04:20.694 16:40:06 -- scripts/common.sh@365 -- # ver2[v]=2
00:04:20.694 16:40:06 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:04:20.694 16:40:06 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:04:20.694 16:40:06 -- scripts/common.sh@367 -- # return 0
00:04:20.694 16:40:06 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:04:20.694 16:40:06 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:04:20.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:20.694 --rc genhtml_branch_coverage=1
00:04:20.694 --rc genhtml_function_coverage=1
00:04:20.694 --rc genhtml_legend=1
00:04:20.694 --rc geninfo_all_blocks=1
00:04:20.694 --rc geninfo_unexecuted_blocks=1
00:04:20.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:20.694 '
[... the same multi-line coverage-option block is echoed three more times as LCOV_OPTS and LCOV='lcov ...' are assigned and exported at @1703/@1704 ...]
00:04:20.694 16:40:06 -- setup/driver.sh@68 -- # setup reset
00:04:20.694 16:40:06 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:20.694 16:40:06 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:25.972 16:40:11 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:04:25.972 16:40:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:25.972 16:40:11 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:25.972 16:40:11 -- common/autotest_common.sh@10 -- # set +x
00:04:25.972 ************************************
00:04:25.972 START TEST guess_driver
00:04:25.972 ************************************
00:04:25.972 16:40:11 -- common/autotest_common.sh@1114 -- # guess_driver
00:04:25.972 16:40:11 -- setup/driver.sh@46 -- # local driver setup_driver marker
00:04:25.972 16:40:11 -- setup/driver.sh@47 -- # local fail=0
00:04:25.972 16:40:11 -- setup/driver.sh@49 -- # pick_driver
00:04:25.972 16:40:11 -- setup/driver.sh@36 -- # vfio
00:04:25.972 16:40:11 -- setup/driver.sh@21 -- # local iommu_grups
00:04:25.972 16:40:11 -- setup/driver.sh@22 -- # local unsafe_vfio
00:04:25.972 16:40:11 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:04:25.972 16:40:11 -- setup/driver.sh@25 -- # unsafe_vfio=N
00:04:25.972 16:40:11 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:04:25.972 16:40:11 -- setup/driver.sh@29 -- # (( 176 > 0 ))
00:04:25.972 16:40:11 -- setup/driver.sh@30 -- # is_driver vfio_pci
00:04:25.972 16:40:11 -- setup/driver.sh@14 -- # mod vfio_pci
00:04:25.972 16:40:11 -- setup/driver.sh@12 -- # dep vfio_pci
00:04:25.972 16:40:11 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:04:25.972 16:40:11 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz
00:04:25.972 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:25.972 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:25.972 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz
00:04:25.972 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz
00:04:25.972 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz
00:04:25.972 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz
00:04:25.972 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:04:25.972 16:40:11 -- setup/driver.sh@30 -- # return 0
00:04:25.972 16:40:11 -- setup/driver.sh@37 -- # echo vfio-pci
00:04:25.972 16:40:11 -- setup/driver.sh@49 -- # driver=vfio-pci
00:04:25.972 16:40:11 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:04:25.972 16:40:11 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
00:04:25.972 Looking for driver=vfio-pci
00:04:25.972 16:40:11 -- setup/driver.sh@45 -- # setup output config
00:04:25.972 16:40:11 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:04:25.972 16:40:11 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:25.972 16:40:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
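pick_driver settles on vfio-pci because IOMMU groups exist (176 of them under /sys/kernel/iommu_groups) and modprobe resolves vfio_pci to loadable .ko modules, which is what the `== *\.\k\o*` pattern above tests. A simplified standalone sketch of that decision; the unsafe-noiommu probe is omitted, and the uio_pci_generic fallback is an assumption about what the harness would try next, since this run never reaches it:

  #!/usr/bin/env bash
  # Decide which userspace PCI driver to bind, in the style of driver.sh.
  is_driver() {
      # Loadable drivers resolve to insmod lines ending in .ko/.ko.xz;
      # built-in or missing modules do not match the pattern.
      [[ $(modprobe --show-depends "$1" 2>/dev/null) == *.ko* ]]
  }

  pick_driver() {
      shopt -s nullglob
      local groups=(/sys/kernel/iommu_groups/*)
      if (( ${#groups[@]} > 0 )) && is_driver vfio_pci; then
          echo vfio-pci            # IOMMU active, vfio usable
      elif is_driver uio_pci_generic; then
          echo uio_pci_generic     # assumed no-IOMMU fallback
      else
          echo 'No valid driver found'
      fi
  }

  echo "Looking for driver=$(pick_driver)"

The test then runs `setup output config` and reads its output line by line, checking that every controller was bound to the driver it just picked.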
setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:29.267 16:40:14 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:29.267 16:40:14 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:29.267 16:40:14 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:31.176 16:40:16 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:31.176 16:40:16 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:31.176 16:40:16 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:31.176 16:40:16 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:31.176 16:40:16 -- setup/driver.sh@65 -- # setup reset 00:04:31.176 16:40:16 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:31.176 16:40:16 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:36.454 00:04:36.454 real 0m10.299s 00:04:36.454 user 0m2.820s 00:04:36.454 sys 0m5.180s 00:04:36.454 16:40:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:36.454 16:40:21 -- common/autotest_common.sh@10 
-- # set +x 00:04:36.454 ************************************ 00:04:36.454 END TEST guess_driver 00:04:36.454 ************************************ 00:04:36.454 00:04:36.454 real 0m15.539s 00:04:36.454 user 0m4.329s 00:04:36.454 sys 0m8.096s 00:04:36.454 16:40:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:36.454 16:40:21 -- common/autotest_common.sh@10 -- # set +x 00:04:36.454 ************************************ 00:04:36.454 END TEST driver 00:04:36.454 ************************************ 00:04:36.454 16:40:21 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:36.454 16:40:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:36.454 16:40:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:36.454 16:40:21 -- common/autotest_common.sh@10 -- # set +x 00:04:36.454 ************************************ 00:04:36.454 START TEST devices 00:04:36.454 ************************************ 00:04:36.454 16:40:21 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:36.454 * Looking for test storage... 00:04:36.454 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:36.454 16:40:21 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:36.454 16:40:21 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:36.454 16:40:21 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:36.454 16:40:21 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:36.454 16:40:21 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:36.454 16:40:21 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:36.454 16:40:21 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:36.454 16:40:21 -- scripts/common.sh@335 -- # IFS=.-: 00:04:36.454 16:40:21 -- scripts/common.sh@335 -- # read -ra ver1 00:04:36.454 16:40:21 -- scripts/common.sh@336 -- # IFS=.-: 00:04:36.454 16:40:21 -- scripts/common.sh@336 -- # read -ra ver2 00:04:36.454 16:40:21 -- scripts/common.sh@337 -- # local 'op=<' 00:04:36.454 16:40:21 -- scripts/common.sh@339 -- # ver1_l=2 00:04:36.454 16:40:21 -- scripts/common.sh@340 -- # ver2_l=1 00:04:36.454 16:40:21 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:36.454 16:40:21 -- scripts/common.sh@343 -- # case "$op" in 00:04:36.454 16:40:21 -- scripts/common.sh@344 -- # : 1 00:04:36.454 16:40:21 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:36.454 16:40:21 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:36.454 16:40:21 -- scripts/common.sh@364 -- # decimal 1 00:04:36.454 16:40:21 -- scripts/common.sh@352 -- # local d=1 00:04:36.454 16:40:21 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:36.454 16:40:21 -- scripts/common.sh@354 -- # echo 1 00:04:36.454 16:40:21 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:36.454 16:40:21 -- scripts/common.sh@365 -- # decimal 2 00:04:36.454 16:40:21 -- scripts/common.sh@352 -- # local d=2 00:04:36.454 16:40:21 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:36.454 16:40:21 -- scripts/common.sh@354 -- # echo 2 00:04:36.454 16:40:21 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:36.454 16:40:21 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:36.454 16:40:21 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:36.454 16:40:21 -- scripts/common.sh@367 -- # return 0 00:04:36.454 16:40:21 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:36.454 16:40:21 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:36.454 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:36.454 --rc genhtml_branch_coverage=1 00:04:36.454 --rc genhtml_function_coverage=1 00:04:36.454 --rc genhtml_legend=1 00:04:36.454 --rc geninfo_all_blocks=1 00:04:36.454 --rc geninfo_unexecuted_blocks=1 00:04:36.454 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:36.454 ' 00:04:36.454 16:40:21 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:36.454 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:36.454 --rc genhtml_branch_coverage=1 00:04:36.454 --rc genhtml_function_coverage=1 00:04:36.454 --rc genhtml_legend=1 00:04:36.454 --rc geninfo_all_blocks=1 00:04:36.454 --rc geninfo_unexecuted_blocks=1 00:04:36.454 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:36.454 ' 00:04:36.454 16:40:21 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:36.454 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:36.454 --rc genhtml_branch_coverage=1 00:04:36.454 --rc genhtml_function_coverage=1 00:04:36.454 --rc genhtml_legend=1 00:04:36.454 --rc geninfo_all_blocks=1 00:04:36.454 --rc geninfo_unexecuted_blocks=1 00:04:36.454 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:36.454 ' 00:04:36.454 16:40:21 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:36.454 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:36.454 --rc genhtml_branch_coverage=1 00:04:36.454 --rc genhtml_function_coverage=1 00:04:36.454 --rc genhtml_legend=1 00:04:36.454 --rc geninfo_all_blocks=1 00:04:36.454 --rc geninfo_unexecuted_blocks=1 00:04:36.454 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:36.454 ' 00:04:36.454 16:40:21 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:36.454 16:40:21 -- setup/devices.sh@192 -- # setup reset 00:04:36.454 16:40:21 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:36.454 16:40:21 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:40.650 16:40:25 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:40.650 16:40:25 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:04:40.650 16:40:25 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:04:40.650 16:40:25 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:04:40.650 16:40:25 
-- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:04:40.650 16:40:25 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:04:40.650 16:40:25 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:04:40.650 16:40:25 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:40.650 16:40:25 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:04:40.650 16:40:25 -- setup/devices.sh@196 -- # blocks=() 00:04:40.650 16:40:25 -- setup/devices.sh@196 -- # declare -a blocks 00:04:40.650 16:40:25 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:40.650 16:40:25 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:40.650 16:40:25 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:40.650 16:40:25 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:40.650 16:40:25 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:40.650 16:40:25 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:40.650 16:40:25 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:04:40.650 16:40:25 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:40.650 16:40:25 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:40.650 16:40:25 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:04:40.650 16:40:25 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:40.650 No valid GPT data, bailing 00:04:40.650 16:40:25 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:40.650 16:40:25 -- scripts/common.sh@393 -- # pt= 00:04:40.650 16:40:25 -- scripts/common.sh@394 -- # return 1 00:04:40.650 16:40:25 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:40.650 16:40:25 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:40.650 16:40:25 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:40.650 16:40:25 -- setup/common.sh@80 -- # echo 1600321314816 00:04:40.650 16:40:25 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:04:40.650 16:40:25 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:40.650 16:40:25 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:04:40.650 16:40:25 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:40.650 16:40:25 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:40.650 16:40:25 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:40.650 16:40:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:40.650 16:40:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:40.650 16:40:25 -- common/autotest_common.sh@10 -- # set +x 00:04:40.650 ************************************ 00:04:40.650 START TEST nvme_mount 00:04:40.650 ************************************ 00:04:40.650 16:40:25 -- common/autotest_common.sh@1114 -- # nvme_mount 00:04:40.650 16:40:25 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:40.650 16:40:25 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:40.650 16:40:25 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:40.650 16:40:25 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:40.650 16:40:25 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:40.650 16:40:25 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:40.650 16:40:25 -- setup/common.sh@40 -- # local part_no=1 00:04:40.650 16:40:25 -- setup/common.sh@41 -- # 
local size=1073741824 00:04:40.650 16:40:25 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:40.650 16:40:25 -- setup/common.sh@44 -- # parts=() 00:04:40.650 16:40:25 -- setup/common.sh@44 -- # local parts 00:04:40.650 16:40:25 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:40.650 16:40:25 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:40.650 16:40:25 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:40.650 16:40:25 -- setup/common.sh@46 -- # (( part++ )) 00:04:40.650 16:40:25 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:40.650 16:40:25 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:40.650 16:40:25 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:40.650 16:40:25 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:41.219 Creating new GPT entries in memory. 00:04:41.219 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:41.219 other utilities. 00:04:41.219 16:40:26 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:41.219 16:40:26 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:41.219 16:40:26 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:41.219 16:40:26 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:41.219 16:40:26 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:42.158 Creating new GPT entries in memory. 00:04:42.158 The operation has completed successfully. 00:04:42.158 16:40:27 -- setup/common.sh@57 -- # (( part++ )) 00:04:42.158 16:40:27 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:42.158 16:40:27 -- setup/common.sh@62 -- # wait 442785 00:04:42.417 16:40:27 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.418 16:40:27 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:42.418 16:40:27 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.418 16:40:27 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:42.418 16:40:27 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:42.418 16:40:27 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.418 16:40:27 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:42.418 16:40:27 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:42.418 16:40:27 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:42.418 16:40:27 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:42.418 16:40:27 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:42.418 16:40:27 -- setup/devices.sh@53 -- # local found=0 00:04:42.418 16:40:27 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:42.418 16:40:27 -- setup/devices.sh@56 -- # : 00:04:42.418 16:40:27 -- setup/devices.sh@59 -- # local pci status 00:04:42.418 16:40:27 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:42.418 16:40:27 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:42.418 16:40:27 -- setup/devices.sh@47 -- # setup output config 00:04:42.418 16:40:27 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:42.418 16:40:27 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:45.711 16:40:31 -- setup/devices.sh@63 -- # found=1 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:45.711 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.711 16:40:31 -- 
setup/devices.sh@66 -- # (( found == 1 )) 00:04:45.711 16:40:31 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:45.711 16:40:31 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.711 16:40:31 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:45.711 16:40:31 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:45.971 16:40:31 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:45.971 16:40:31 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.971 16:40:31 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.971 16:40:31 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:45.971 16:40:31 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:45.971 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:45.971 16:40:31 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:45.971 16:40:31 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:46.231 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:46.231 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:46.231 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:46.231 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:46.231 16:40:31 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:46.231 16:40:31 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:46.231 16:40:31 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.231 16:40:31 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:46.231 16:40:31 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:46.231 16:40:31 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.231 16:40:31 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.231 16:40:31 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:46.231 16:40:31 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:46.231 16:40:31 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:46.231 16:40:31 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:46.231 16:40:31 -- setup/devices.sh@53 -- # local found=0 00:04:46.231 16:40:31 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:46.231 16:40:31 -- setup/devices.sh@56 -- # : 00:04:46.231 16:40:31 -- setup/devices.sh@59 -- # local pci status 00:04:46.231 16:40:31 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.231 16:40:31 -- setup/devices.sh@47 -- # 
PCI_ALLOWED=0000:d8:00.0 00:04:46.231 16:40:31 -- setup/devices.sh@47 -- # setup output config 00:04:46.231 16:40:31 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:46.231 16:40:31 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:49.525 16:40:35 -- setup/devices.sh@63 -- # found=1 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.525 16:40:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.525 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.785 16:40:35 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:49.785 16:40:35 -- setup/devices.sh@68 -- # [[ -n 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:49.785 16:40:35 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.785 16:40:35 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:49.785 16:40:35 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:49.785 16:40:35 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:49.785 16:40:35 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:04:49.785 16:40:35 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:49.785 16:40:35 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:49.785 16:40:35 -- setup/devices.sh@50 -- # local mount_point= 00:04:49.785 16:40:35 -- setup/devices.sh@51 -- # local test_file= 00:04:49.785 16:40:35 -- setup/devices.sh@53 -- # local found=0 00:04:49.785 16:40:35 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:49.785 16:40:35 -- setup/devices.sh@59 -- # local pci status 00:04:49.785 16:40:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.785 16:40:35 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:49.785 16:40:35 -- setup/devices.sh@47 -- # setup output config 00:04:49.785 16:40:35 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:49.785 16:40:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:53.079 16:40:38 -- setup/devices.sh@63 -- # found=1 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 
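
[editor's note] The verify step traced above reads setup.sh status-style rows with `read -r pci _ _ status`, skips every PCI address that is not PCI_ALLOWED, and flips found=1 once the allowed device reports the expected mount signature ("Active devices: data@nvme0n1 ..."). A minimal standalone sketch of that allow-list scan; the variable names and sample rows below are illustrative, not lifted from devices.sh:

#!/usr/bin/env bash
# Sketch: walk "<bdf> <vendor> <device> <status...>" rows; only the allowed
# BDF is inspected, and found flips when its status names the active mount.
allowed_bdf="0000:d8:00.0"   # plays the role of PCI_ALLOWED in the trace
wanted="data@nvme0n1"        # mount signature the verify step looks for
found=0
while read -r pci _ _ status; do
    [[ $pci == "$allowed_bdf" ]] || continue
    [[ $status == *"Active devices: "*"$wanted"* ]] && found=1
done < <(printf '%s\n' \
    "0000:00:04.0 8086 2021 ioatdma" \
    "0000:d8:00.0 8086 0a54 Active devices: data@nvme0n1, so not binding PCI dev")
(( found == 1 )) && echo "allowed device carries $wanted"
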
16:40:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.079 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.079 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.080 16:40:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:53.080 16:40:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:53.340 16:40:38 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:53.340 16:40:38 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:53.340 16:40:38 -- setup/devices.sh@68 -- # return 0 00:04:53.340 16:40:38 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:53.340 16:40:38 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:53.340 16:40:38 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:53.340 16:40:38 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:53.340 16:40:38 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:53.340 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:53.340 00:04:53.340 real 0m13.040s 00:04:53.340 user 0m3.872s 00:04:53.340 sys 0m7.151s 00:04:53.340 16:40:38 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:53.340 16:40:38 -- common/autotest_common.sh@10 -- # set +x 00:04:53.340 ************************************ 00:04:53.340 END TEST nvme_mount 00:04:53.340 ************************************ 00:04:53.340 16:40:38 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:53.340 16:40:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:53.340 16:40:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:53.340 16:40:38 -- common/autotest_common.sh@10 -- # set +x 00:04:53.340 ************************************ 00:04:53.340 START TEST dm_mount 00:04:53.340 ************************************ 00:04:53.340 16:40:38 -- common/autotest_common.sh@1114 -- # dm_mount 00:04:53.340 16:40:38 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:53.340 16:40:38 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:53.340 16:40:38 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:53.340 16:40:38 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:53.340 16:40:38 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:53.340 16:40:38 -- setup/common.sh@40 -- # local part_no=2 00:04:53.340 16:40:38 -- setup/common.sh@41 -- # local size=1073741824 00:04:53.340 16:40:38 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:53.340 16:40:38 -- setup/common.sh@44 -- # parts=() 00:04:53.340 16:40:38 -- setup/common.sh@44 -- # local parts 00:04:53.340 16:40:38 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:53.340 16:40:38 -- setup/common.sh@46 -- # (( part <= part_no )) 
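
[editor's note] The partition_drive loop continuing below collects nvme0n1p1 and nvme0n1p2, converts the 1 GiB size to 512-byte sectors, and derives each partition's start and end sector in the setup/common.sh@58-59 steps that follow. A self-contained sketch of that arithmetic, reproducing the sgdisk arguments seen later in the trace:

#!/usr/bin/env bash
# Sketch of the boundary arithmetic: part 1 starts at sector 2048, and each
# later part starts one sector past the previous part's end.
size=$(( 1073741824 / 512 ))   # the (( size /= 512 )) step: 1 GiB in sectors
part_no=2 part_start=0 part_end=0
for (( part = 1; part <= part_no; part++ )); do
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    echo "sgdisk /dev/nvme0n1 --new=${part}:${part_start}:${part_end}"
done
# Prints --new=1:2048:2099199 and --new=2:2099200:4196351, matching the log.
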
00:04:53.340 16:40:38 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:53.340 16:40:38 -- setup/common.sh@46 -- # (( part++ )) 00:04:53.340 16:40:38 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.340 16:40:38 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:53.340 16:40:38 -- setup/common.sh@46 -- # (( part++ )) 00:04:53.340 16:40:38 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:53.340 16:40:38 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:53.340 16:40:38 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:53.340 16:40:38 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:54.279 Creating new GPT entries in memory. 00:04:54.279 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:54.279 other utilities. 00:04:54.279 16:40:39 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:54.279 16:40:39 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:54.279 16:40:39 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:54.279 16:40:39 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:54.279 16:40:39 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:55.219 Creating new GPT entries in memory. 00:04:55.219 The operation has completed successfully. 00:04:55.219 16:40:40 -- setup/common.sh@57 -- # (( part++ )) 00:04:55.219 16:40:40 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:55.219 16:40:40 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:55.219 16:40:40 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:55.219 16:40:40 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:56.601 The operation has completed successfully. 
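
[editor's note] Each sgdisk call above runs under flock on the disk node, and scripts/sync_dev_uevents.sh holds the step until the kernel has announced the new partition. A generic equivalent of that serialize-then-wait pattern, using udevadm settle as a stand-in for the SPDK helper (this is an approximation, not the helper itself):

#!/usr/bin/env bash
# Stand-in for the flock + uevent-sync pattern: write the partition table
# while holding the disk lock, then wait for udev before using the new node.
disk=/dev/nvme0n1
flock "$disk" sgdisk "$disk" --new=1:2048:2099199   # serialized table write
udevadm settle --timeout=30                          # drain pending uevents
[[ -b ${disk}p1 ]] && echo "partition node ready: ${disk}p1"
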
00:04:56.601 16:40:41 -- setup/common.sh@57 -- # (( part++ )) 00:04:56.601 16:40:41 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:56.601 16:40:41 -- setup/common.sh@62 -- # wait 447515 00:04:56.601 16:40:42 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:56.601 16:40:42 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.601 16:40:42 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.601 16:40:42 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:56.601 16:40:42 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:56.601 16:40:42 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.601 16:40:42 -- setup/devices.sh@161 -- # break 00:04:56.601 16:40:42 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.601 16:40:42 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:56.601 16:40:42 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:56.601 16:40:42 -- setup/devices.sh@166 -- # dm=dm-0 00:04:56.601 16:40:42 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:56.601 16:40:42 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:56.601 16:40:42 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.601 16:40:42 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:56.601 16:40:42 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.601 16:40:42 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:56.601 16:40:42 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:56.601 16:40:42 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.601 16:40:42 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.601 16:40:42 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:56.601 16:40:42 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:56.601 16:40:42 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:56.601 16:40:42 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:56.601 16:40:42 -- setup/devices.sh@53 -- # local found=0 00:04:56.601 16:40:42 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:56.601 16:40:42 -- setup/devices.sh@56 -- # : 00:04:56.601 16:40:42 -- setup/devices.sh@59 -- # local pci status 00:04:56.601 16:40:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:56.601 16:40:42 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:56.601 16:40:42 -- setup/devices.sh@47 -- # setup output config 00:04:56.601 16:40:42 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:56.601 16:40:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:59.895 16:40:45 -- 
setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:59.895 16:40:45 -- setup/devices.sh@63 -- # found=1 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:59.895 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.895 16:40:45 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:59.895 16:40:45 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:59.895 16:40:45 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:59.895 16:40:45 -- setup/devices.sh@73 -- # 
[[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:59.895 16:40:45 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:59.895 16:40:45 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:00.155 16:40:45 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:00.155 16:40:45 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:00.155 16:40:45 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:00.155 16:40:45 -- setup/devices.sh@50 -- # local mount_point= 00:05:00.155 16:40:45 -- setup/devices.sh@51 -- # local test_file= 00:05:00.155 16:40:45 -- setup/devices.sh@53 -- # local found=0 00:05:00.155 16:40:45 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:00.155 16:40:45 -- setup/devices.sh@59 -- # local pci status 00:05:00.155 16:40:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:00.155 16:40:45 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:00.155 16:40:45 -- setup/devices.sh@47 -- # setup output config 00:05:00.155 16:40:45 -- setup/common.sh@9 -- # [[ output == output ]] 00:05:00.155 16:40:45 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:03.450 16:40:48 -- setup/devices.sh@63 -- # found=1 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:03.450 16:40:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.450 16:40:49 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:03.450 16:40:49 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:03.450 16:40:49 -- setup/devices.sh@68 -- # return 0 00:05:03.450 16:40:49 -- setup/devices.sh@187 -- # cleanup_dm 00:05:03.450 16:40:49 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:03.450 16:40:49 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:03.450 16:40:49 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:03.450 16:40:49 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.450 16:40:49 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:03.450 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:03.450 16:40:49 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:03.450 16:40:49 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:03.450 00:05:03.450 real 0m10.283s 00:05:03.450 user 0m2.636s 00:05:03.450 sys 0m4.772s 00:05:03.450 16:40:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:03.450 16:40:49 -- common/autotest_common.sh@10 -- # set +x 00:05:03.450 ************************************ 00:05:03.450 END TEST dm_mount 00:05:03.450 ************************************ 00:05:03.709 16:40:49 -- setup/devices.sh@1 -- # cleanup 00:05:03.709 16:40:49 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:03.709 16:40:49 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:03.709 16:40:49 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:03.709 16:40:49 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:03.709 16:40:49 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:03.709 16:40:49 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:03.968 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:03.968 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:03.968 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:03.968 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:03.968 16:40:49 -- setup/devices.sh@12 -- # cleanup_dm 00:05:03.969 16:40:49 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:03.969 16:40:49 -- setup/devices.sh@36 -- # [[ -L 
/dev/mapper/nvme_dm_test ]]
00:05:03.969 16:40:49 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]]
00:05:03.969 16:40:49 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]]
00:05:03.969 16:40:49 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]]
00:05:03.969 16:40:49 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1
00:05:03.969
00:05:03.969 real 0m27.927s
00:05:03.969 user 0m8.090s
00:05:03.969 sys 0m14.882s
00:05:03.969 16:40:49 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:03.969 16:40:49 -- common/autotest_common.sh@10 -- # set +x
00:05:03.969 ************************************
00:05:03.969 END TEST devices
00:05:03.969 ************************************
00:05:03.969
00:05:03.969 real 1m37.859s
00:05:03.969 user 0m31.306s
00:05:03.969 sys 0m55.834s
00:05:03.969 16:40:49 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:03.969 16:40:49 -- common/autotest_common.sh@10 -- # set +x
00:05:03.969 ************************************
00:05:03.969 END TEST setup.sh
00:05:03.969 ************************************
00:05:03.969 16:40:49 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:05:07.263 Hugepages
00:05:07.263 node     hugesize     free /  total
00:05:07.263 node0   1048576kB        0 /      0
00:05:07.263 node0      2048kB     2048 /   2048
00:05:07.263 node1   1048576kB        0 /      0
00:05:07.263 node1      2048kB        0 /      0
00:05:07.263
00:05:07.263 Type     BDF             Vendor Device NUMA    Driver           Device     Block devices
00:05:07.263 I/OAT    0000:00:04.0    8086   2021   0       ioatdma          -          -
00:05:07.263 I/OAT    0000:00:04.1    8086   2021   0       ioatdma          -          -
00:05:07.263 I/OAT    0000:00:04.2    8086   2021   0       ioatdma          -          -
00:05:07.263 I/OAT    0000:00:04.3    8086   2021   0       ioatdma          -          -
00:05:07.263 I/OAT    0000:00:04.4    8086   2021   0       ioatdma          -          -
00:05:07.263 I/OAT    0000:00:04.5    8086   2021   0       ioatdma          -          -
00:05:07.263 I/OAT    0000:00:04.6    8086   2021   0       ioatdma          -          -
00:05:07.263 I/OAT    0000:00:04.7    8086   2021   0       ioatdma          -          -
00:05:07.263 I/OAT    0000:80:04.0    8086   2021   1       ioatdma          -          -
00:05:07.263 I/OAT    0000:80:04.1    8086   2021   1       ioatdma          -          -
00:05:07.263 I/OAT    0000:80:04.2    8086   2021   1       ioatdma          -          -
00:05:07.263 I/OAT    0000:80:04.3    8086   2021   1       ioatdma          -          -
00:05:07.263 I/OAT    0000:80:04.4    8086   2021   1       ioatdma          -          -
00:05:07.263 I/OAT    0000:80:04.5    8086   2021   1       ioatdma          -          -
00:05:07.263 I/OAT    0000:80:04.6    8086   2021   1       ioatdma          -          -
00:05:07.263 I/OAT    0000:80:04.7    8086   2021   1       ioatdma          -          -
00:05:07.522 NVMe     0000:d8:00.0    8086   0a54   1       nvme             nvme0      nvme0n1
00:05:07.522 16:40:53 -- spdk/autotest.sh@128 -- # uname -s
00:05:07.522 16:40:53 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]]
00:05:07.522 16:40:53 -- spdk/autotest.sh@130 -- # nvme_namespace_revert
00:05:07.522 16:40:53 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:10.814 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:05:10.814 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:05:10.814 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:05:10.814 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:05:10.814 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:05:10.814 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:05:10.814 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:05:10.814 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:05:11.074 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:05:11.074 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:05:11.074 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:05:11.074 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:05:11.074 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:05:11.074
0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:11.074 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:11.074 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:12.453 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:12.712 16:40:58 -- common/autotest_common.sh@1527 -- # sleep 1 00:05:13.660 16:40:59 -- common/autotest_common.sh@1528 -- # bdfs=() 00:05:13.660 16:40:59 -- common/autotest_common.sh@1528 -- # local bdfs 00:05:13.660 16:40:59 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:05:13.661 16:40:59 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:05:13.661 16:40:59 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:13.661 16:40:59 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:13.661 16:40:59 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:13.661 16:40:59 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:13.661 16:40:59 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:13.661 16:40:59 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:13.661 16:40:59 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:13.661 16:40:59 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:17.863 Waiting for block devices as requested 00:05:17.863 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:17.863 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:17.863 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:17.863 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:17.863 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:17.863 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:17.863 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:17.863 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:17.863 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:17.863 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:18.122 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:18.122 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:18.122 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:18.380 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:18.380 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:18.380 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:18.638 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:18.638 16:41:04 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:05:18.638 16:41:04 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:18.638 16:41:04 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:05:18.638 16:41:04 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:05:18.638 16:41:04 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:18.638 16:41:04 -- common/autotest_common.sh@1498 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:18.638 16:41:04 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:18.638 16:41:04 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:05:18.638 16:41:04 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:05:18.638 16:41:04 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:05:18.638 16:41:04 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:18.638 
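
[editor's note] get_nvme_ctrlr_from_bdf, traced just above, resolves which /dev/nvmeX controller sits on the allowed PCI address by matching the controller's canonical sysfs path against the BDF. A standalone sketch of that sysfs walk; the loop over /sys/class/nvme is my framing, where the trace resolves a single known controller directly:

#!/usr/bin/env bash
# Sketch: the canonical sysfs path of an NVMe controller embeds its PCI
# hierarchy, e.g. /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0,
# so a BDF can be mapped to its /dev node by a substring match.
bdf="0000:d8:00.0"
for ctrl in /sys/class/nvme/nvme*; do
    [[ -e $ctrl ]] || continue
    if [[ $(readlink -f "$ctrl") == *"/$bdf/nvme/"* ]]; then
        echo "/dev/$(basename "$ctrl")"
    fi
done
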
16:41:04 -- common/autotest_common.sh@1540 -- # grep oacs 00:05:18.638 16:41:04 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:18.896 16:41:04 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:05:18.896 16:41:04 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:05:18.896 16:41:04 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:05:18.896 16:41:04 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:05:18.896 16:41:04 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:05:18.896 16:41:04 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:05:18.896 16:41:04 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:05:18.896 16:41:04 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:05:18.896 16:41:04 -- common/autotest_common.sh@1552 -- # continue 00:05:18.896 16:41:04 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:05:18.896 16:41:04 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:18.896 16:41:04 -- common/autotest_common.sh@10 -- # set +x 00:05:18.896 16:41:04 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:05:18.896 16:41:04 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:18.896 16:41:04 -- common/autotest_common.sh@10 -- # set +x 00:05:18.896 16:41:04 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:22.186 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:22.186 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:22.186 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:22.446 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:24.353 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:24.353 16:41:09 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:05:24.353 16:41:09 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:24.353 16:41:09 -- common/autotest_common.sh@10 -- # set +x 00:05:24.353 16:41:09 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:05:24.353 16:41:09 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:05:24.353 16:41:09 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:05:24.353 16:41:09 -- common/autotest_common.sh@1572 -- # bdfs=() 00:05:24.353 16:41:09 -- common/autotest_common.sh@1572 -- # local bdfs 00:05:24.353 16:41:09 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs 00:05:24.353 16:41:09 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:24.353 16:41:09 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:24.353 16:41:09 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:24.353 16:41:09 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:24.353 16:41:09 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:24.353 16:41:09 -- 
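The controller probe traced above boils down to a sysfs walk plus an nvme-cli pipeline. A sketch, assuming nvme-cli is installed and the controller sits at 0000:d8:00.0 as in this run; the 0x8 mask picks out the namespace-management bit of OACS, which is why oacs=0xe yields oacs_ns_manage=8:

# resolve /dev/nvme0 from the PCI address, as get_nvme_ctrlr_from_bdf does
ctrlr=/dev/$(basename "$(readlink -f /sys/class/nvme/nvme* | grep '0000:d8:00.0/nvme/nvme')")
oacs=$(nvme id-ctrl "$ctrlr" | grep oacs | cut -d: -f2)
oacs_ns_manage=$((oacs & 0x8))        # bit 3: namespace management supported
unvmcap=$(nvme id-ctrl "$ctrlr" | grep unvmcap | cut -d: -f2)
[[ $unvmcap -eq 0 ]] && echo "no unallocated capacity on $ctrlr; nothing to revert"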
common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:24.353 16:41:09 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:24.353 16:41:09 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:05:24.353 16:41:09 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:24.353 16:41:09 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:05:24.353 16:41:09 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:24.353 16:41:09 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:05:24.353 16:41:09 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:05:24.353 16:41:09 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:05:24.353 16:41:09 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=457490 00:05:24.353 16:41:09 -- common/autotest_common.sh@1593 -- # waitforlisten 457490 00:05:24.353 16:41:09 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:24.353 16:41:09 -- common/autotest_common.sh@829 -- # '[' -z 457490 ']' 00:05:24.353 16:41:09 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:24.353 16:41:09 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:24.353 16:41:09 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:24.353 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:24.353 16:41:09 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:24.353 16:41:09 -- common/autotest_common.sh@10 -- # set +x 00:05:24.353 [2024-11-16 16:41:09.974135] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
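opal_revert_cleanup finds its targets the same way get_nvme_bdfs does in the trace above: gen_nvme.sh emits a JSON config and jq pulls out each controller's PCI address, which is then filtered by PCI device id. A sketch of that discovery, assuming an SPDK checkout in $rootdir (0x0a54 is the Intel datacenter NVMe device id this test keys on):

bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
for bdf in "${bdfs[@]}"; do
    # keep only controllers whose PCI device id matches 0x0a54
    [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && echo "$bdf"
done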
00:05:24.353 [2024-11-16 16:41:09.974207] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid457490 ] 00:05:24.353 EAL: No free 2048 kB hugepages reported on node 1 00:05:24.353 [2024-11-16 16:41:10.056656] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:24.353 [2024-11-16 16:41:10.095162] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:24.353 [2024-11-16 16:41:10.095283] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:25.290 16:41:10 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:25.290 16:41:10 -- common/autotest_common.sh@862 -- # return 0 00:05:25.290 16:41:10 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:05:25.290 16:41:10 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:05:25.290 16:41:10 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:28.582 nvme0n1 00:05:28.582 16:41:13 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:28.582 [2024-11-16 16:41:14.006919] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:28.582 request: 00:05:28.582 { 00:05:28.582 "nvme_ctrlr_name": "nvme0", 00:05:28.582 "password": "test", 00:05:28.582 "method": "bdev_nvme_opal_revert", 00:05:28.582 "req_id": 1 00:05:28.582 } 00:05:28.582 Got JSON-RPC error response 00:05:28.582 response: 00:05:28.582 { 00:05:28.582 "code": -32602, 00:05:28.582 "message": "Invalid parameters" 00:05:28.582 } 00:05:28.582 16:41:14 -- common/autotest_common.sh@1599 -- # true 00:05:28.582 16:41:14 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:05:28.582 16:41:14 -- common/autotest_common.sh@1603 -- # killprocess 457490 00:05:28.582 16:41:14 -- common/autotest_common.sh@936 -- # '[' -z 457490 ']' 00:05:28.582 16:41:14 -- common/autotest_common.sh@940 -- # kill -0 457490 00:05:28.582 16:41:14 -- common/autotest_common.sh@941 -- # uname 00:05:28.582 16:41:14 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:28.582 16:41:14 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 457490 00:05:28.582 16:41:14 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:28.582 16:41:14 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:28.582 16:41:14 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 457490' 00:05:28.582 killing process with pid 457490 00:05:28.582 16:41:14 -- common/autotest_common.sh@955 -- # kill 457490 00:05:28.582 16:41:14 -- common/autotest_common.sh@960 -- # wait 457490 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 
2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.582 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: 
Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping 
cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 
00:05:28.583 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:05:28.584 EAL: Unexpected size 
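The attach/revert exchange above can be replayed by hand once spdk_tgt is listening on /var/tmp/spdk.sock. The -32602 response is the expected outcome on a controller without Opal support, which appears to be why the trace shows "true" executing right after the failed call (sketch, paths relative to the SPDK tree):

scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test || true   # fails cleanly on non-Opal drives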
00:05:31.123 16:41:16 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:05:31.123 16:41:16 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:05:31.123 16:41:16 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:31.123 16:41:16 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:31.123 16:41:16 -- spdk/autotest.sh@160 -- # timing_enter lib 00:05:31.123 16:41:16 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:31.123 16:41:16 -- common/autotest_common.sh@10 -- # set +x 00:05:31.123 16:41:16 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:31.123 16:41:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:31.123 16:41:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:31.123 16:41:16 -- common/autotest_common.sh@10 -- # set +x 00:05:31.123 ************************************ 00:05:31.123 START TEST env 00:05:31.123 ************************************ 00:05:31.123 16:41:16 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:31.123 * Looking for test storage... 00:05:31.123 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:31.123 16:41:16 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:31.123 16:41:16 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:31.123 16:41:16 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:31.123 16:41:16 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:31.123 16:41:16 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:31.123 16:41:16 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:31.123 16:41:16 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:31.123 16:41:16 -- scripts/common.sh@335 -- # IFS=.-: 00:05:31.123 16:41:16 -- scripts/common.sh@335 -- # read -ra ver1 00:05:31.123 16:41:16 -- scripts/common.sh@336 -- # IFS=.-: 00:05:31.123 16:41:16 -- scripts/common.sh@336 -- # read -ra ver2 00:05:31.123 16:41:16 -- scripts/common.sh@337 -- # local 'op=<' 00:05:31.123 16:41:16 -- scripts/common.sh@339 -- # ver1_l=2 00:05:31.123 16:41:16 -- scripts/common.sh@340 -- # ver2_l=1 00:05:31.123 16:41:16 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:31.123 16:41:16 -- scripts/common.sh@343 -- # case "$op" in 00:05:31.123 16:41:16 -- scripts/common.sh@344 -- # : 1 00:05:31.123 16:41:16 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:31.123 16:41:16 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ?
ver1_l : ver2_l) )) 00:05:31.123 16:41:16 -- scripts/common.sh@364 -- # decimal 1 00:05:31.123 16:41:16 -- scripts/common.sh@352 -- # local d=1 00:05:31.123 16:41:16 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:31.123 16:41:16 -- scripts/common.sh@354 -- # echo 1 00:05:31.123 16:41:16 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:31.123 16:41:16 -- scripts/common.sh@365 -- # decimal 2 00:05:31.123 16:41:16 -- scripts/common.sh@352 -- # local d=2 00:05:31.123 16:41:16 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:31.123 16:41:16 -- scripts/common.sh@354 -- # echo 2 00:05:31.123 16:41:16 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:31.123 16:41:16 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:31.123 16:41:16 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:31.123 16:41:16 -- scripts/common.sh@367 -- # return 0 00:05:31.123 16:41:16 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:31.123 16:41:16 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:31.123 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.123 --rc genhtml_branch_coverage=1 00:05:31.123 --rc genhtml_function_coverage=1 00:05:31.123 --rc genhtml_legend=1 00:05:31.123 --rc geninfo_all_blocks=1 00:05:31.123 --rc geninfo_unexecuted_blocks=1 00:05:31.123 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.123 ' 00:05:31.123 16:41:16 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:31.123 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.123 --rc genhtml_branch_coverage=1 00:05:31.123 --rc genhtml_function_coverage=1 00:05:31.123 --rc genhtml_legend=1 00:05:31.123 --rc geninfo_all_blocks=1 00:05:31.123 --rc geninfo_unexecuted_blocks=1 00:05:31.123 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.123 ' 00:05:31.123 16:41:16 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:31.123 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.123 --rc genhtml_branch_coverage=1 00:05:31.123 --rc genhtml_function_coverage=1 00:05:31.123 --rc genhtml_legend=1 00:05:31.123 --rc geninfo_all_blocks=1 00:05:31.123 --rc geninfo_unexecuted_blocks=1 00:05:31.123 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.123 ' 00:05:31.123 16:41:16 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:31.123 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.123 --rc genhtml_branch_coverage=1 00:05:31.123 --rc genhtml_function_coverage=1 00:05:31.123 --rc genhtml_legend=1 00:05:31.123 --rc geninfo_all_blocks=1 00:05:31.123 --rc geninfo_unexecuted_blocks=1 00:05:31.123 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.123 ' 00:05:31.123 16:41:16 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:31.123 16:41:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:31.123 16:41:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:31.123 16:41:16 -- common/autotest_common.sh@10 -- # set +x 00:05:31.123 ************************************ 00:05:31.123 START TEST env_memory 00:05:31.123 ************************************ 00:05:31.123 16:41:16 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
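The scripts/common.sh walk traced above is a plain field-by-field version comparison (here: is lcov 1.15 older than 2?), splitting on dots, dashes and colons. A rough standalone equivalent of that lt/cmp_versions logic, written as a sketch rather than a copy of the script:

lt() {
    local -a ver1 ver2
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$2"
    local v max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < max; v++ )); do
        # compare field by field; missing fields count as 0
        (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
        (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
    done
    return 1
}
lt 1.15 2 && echo "lcov is pre-2.x: enable the branch/function coverage options"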
00:05:31.123 00:05:31.123 00:05:31.123 CUnit - A unit testing framework for C - Version 2.1-3 00:05:31.123 http://cunit.sourceforge.net/ 00:05:31.123 00:05:31.123 00:05:31.123 Suite: memory 00:05:31.123 Test: alloc and free memory map ...[2024-11-16 16:41:16.546327] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:31.123 passed 00:05:31.123 Test: mem map translation ...[2024-11-16 16:41:16.559754] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:31.123 [2024-11-16 16:41:16.559779] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:31.123 [2024-11-16 16:41:16.559826] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:31.123 [2024-11-16 16:41:16.559834] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:31.123 passed 00:05:31.123 Test: mem map registration ...[2024-11-16 16:41:16.579578] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:31.123 [2024-11-16 16:41:16.579592] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:31.123 passed 00:05:31.123 Test: mem map adjacent registrations ...passed 00:05:31.123 00:05:31.123 Run Summary: Type Total Ran Passed Failed Inactive 00:05:31.123 suites 1 1 n/a 0 0 00:05:31.123 tests 4 4 4 0 0 00:05:31.123 asserts 152 152 152 0 n/a 00:05:31.123 00:05:31.123 Elapsed time = 0.085 seconds 00:05:31.123 00:05:31.123 real 0m0.098s 00:05:31.123 user 0m0.086s 00:05:31.123 sys 0m0.012s 00:05:31.123 16:41:16 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:31.123 16:41:16 -- common/autotest_common.sh@10 -- # set +x 00:05:31.123 ************************************ 00:05:31.123 END TEST env_memory 00:05:31.123 ************************************ 00:05:31.123 16:41:16 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:31.123 16:41:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:31.123 16:41:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:31.123 16:41:16 -- common/autotest_common.sh@10 -- # set +x 00:05:31.123 ************************************ 00:05:31.123 START TEST env_vtophys 00:05:31.123 ************************************ 00:05:31.123 16:41:16 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:31.123 EAL: lib.eal log level changed from notice to debug 00:05:31.123 EAL: Detected lcore 0 as core 0 on socket 0 00:05:31.123 EAL: Detected lcore 1 as core 1 on socket 0 00:05:31.123 EAL: Detected lcore 2 as core 2 on socket 0 00:05:31.123 EAL: Detected lcore 3 as core 3 on socket 0 00:05:31.123 EAL: Detected lcore 4 as core 4 on socket 0 00:05:31.123 EAL: Detected lcore 5 as core 5 on socket 0 00:05:31.123 EAL: Detected lcore 6 as 
core 6 on socket 0 00:05:31.123 EAL: Detected lcore 7 as core 8 on socket 0 00:05:31.123 EAL: Detected lcore 8 as core 9 on socket 0 00:05:31.123 EAL: Detected lcore 9 as core 10 on socket 0 00:05:31.123 EAL: Detected lcore 10 as core 11 on socket 0 00:05:31.123 EAL: Detected lcore 11 as core 12 on socket 0 00:05:31.123 EAL: Detected lcore 12 as core 13 on socket 0 00:05:31.124 EAL: Detected lcore 13 as core 14 on socket 0 00:05:31.124 EAL: Detected lcore 14 as core 16 on socket 0 00:05:31.124 EAL: Detected lcore 15 as core 17 on socket 0 00:05:31.124 EAL: Detected lcore 16 as core 18 on socket 0 00:05:31.124 EAL: Detected lcore 17 as core 19 on socket 0 00:05:31.124 EAL: Detected lcore 18 as core 20 on socket 0 00:05:31.124 EAL: Detected lcore 19 as core 21 on socket 0 00:05:31.124 EAL: Detected lcore 20 as core 22 on socket 0 00:05:31.124 EAL: Detected lcore 21 as core 24 on socket 0 00:05:31.124 EAL: Detected lcore 22 as core 25 on socket 0 00:05:31.124 EAL: Detected lcore 23 as core 26 on socket 0 00:05:31.124 EAL: Detected lcore 24 as core 27 on socket 0 00:05:31.124 EAL: Detected lcore 25 as core 28 on socket 0 00:05:31.124 EAL: Detected lcore 26 as core 29 on socket 0 00:05:31.124 EAL: Detected lcore 27 as core 30 on socket 0 00:05:31.124 EAL: Detected lcore 28 as core 0 on socket 1 00:05:31.124 EAL: Detected lcore 29 as core 1 on socket 1 00:05:31.124 EAL: Detected lcore 30 as core 2 on socket 1 00:05:31.124 EAL: Detected lcore 31 as core 3 on socket 1 00:05:31.124 EAL: Detected lcore 32 as core 4 on socket 1 00:05:31.124 EAL: Detected lcore 33 as core 5 on socket 1 00:05:31.124 EAL: Detected lcore 34 as core 6 on socket 1 00:05:31.124 EAL: Detected lcore 35 as core 8 on socket 1 00:05:31.124 EAL: Detected lcore 36 as core 9 on socket 1 00:05:31.124 EAL: Detected lcore 37 as core 10 on socket 1 00:05:31.124 EAL: Detected lcore 38 as core 11 on socket 1 00:05:31.124 EAL: Detected lcore 39 as core 12 on socket 1 00:05:31.124 EAL: Detected lcore 40 as core 13 on socket 1 00:05:31.124 EAL: Detected lcore 41 as core 14 on socket 1 00:05:31.124 EAL: Detected lcore 42 as core 16 on socket 1 00:05:31.124 EAL: Detected lcore 43 as core 17 on socket 1 00:05:31.124 EAL: Detected lcore 44 as core 18 on socket 1 00:05:31.124 EAL: Detected lcore 45 as core 19 on socket 1 00:05:31.124 EAL: Detected lcore 46 as core 20 on socket 1 00:05:31.124 EAL: Detected lcore 47 as core 21 on socket 1 00:05:31.124 EAL: Detected lcore 48 as core 22 on socket 1 00:05:31.124 EAL: Detected lcore 49 as core 24 on socket 1 00:05:31.124 EAL: Detected lcore 50 as core 25 on socket 1 00:05:31.124 EAL: Detected lcore 51 as core 26 on socket 1 00:05:31.124 EAL: Detected lcore 52 as core 27 on socket 1 00:05:31.124 EAL: Detected lcore 53 as core 28 on socket 1 00:05:31.124 EAL: Detected lcore 54 as core 29 on socket 1 00:05:31.124 EAL: Detected lcore 55 as core 30 on socket 1 00:05:31.124 EAL: Detected lcore 56 as core 0 on socket 0 00:05:31.124 EAL: Detected lcore 57 as core 1 on socket 0 00:05:31.124 EAL: Detected lcore 58 as core 2 on socket 0 00:05:31.124 EAL: Detected lcore 59 as core 3 on socket 0 00:05:31.124 EAL: Detected lcore 60 as core 4 on socket 0 00:05:31.124 EAL: Detected lcore 61 as core 5 on socket 0 00:05:31.124 EAL: Detected lcore 62 as core 6 on socket 0 00:05:31.124 EAL: Detected lcore 63 as core 8 on socket 0 00:05:31.124 EAL: Detected lcore 64 as core 9 on socket 0 00:05:31.124 EAL: Detected lcore 65 as core 10 on socket 0 00:05:31.124 EAL: Detected lcore 66 as core 11 on socket 0 00:05:31.124 EAL: 
Detected lcore 67 as core 12 on socket 0 00:05:31.124 EAL: Detected lcore 68 as core 13 on socket 0 00:05:31.124 EAL: Detected lcore 69 as core 14 on socket 0 00:05:31.124 EAL: Detected lcore 70 as core 16 on socket 0 00:05:31.124 EAL: Detected lcore 71 as core 17 on socket 0 00:05:31.124 EAL: Detected lcore 72 as core 18 on socket 0 00:05:31.124 EAL: Detected lcore 73 as core 19 on socket 0 00:05:31.124 EAL: Detected lcore 74 as core 20 on socket 0 00:05:31.124 EAL: Detected lcore 75 as core 21 on socket 0 00:05:31.124 EAL: Detected lcore 76 as core 22 on socket 0 00:05:31.124 EAL: Detected lcore 77 as core 24 on socket 0 00:05:31.124 EAL: Detected lcore 78 as core 25 on socket 0 00:05:31.124 EAL: Detected lcore 79 as core 26 on socket 0 00:05:31.124 EAL: Detected lcore 80 as core 27 on socket 0 00:05:31.124 EAL: Detected lcore 81 as core 28 on socket 0 00:05:31.124 EAL: Detected lcore 82 as core 29 on socket 0 00:05:31.124 EAL: Detected lcore 83 as core 30 on socket 0 00:05:31.124 EAL: Detected lcore 84 as core 0 on socket 1 00:05:31.124 EAL: Detected lcore 85 as core 1 on socket 1 00:05:31.124 EAL: Detected lcore 86 as core 2 on socket 1 00:05:31.124 EAL: Detected lcore 87 as core 3 on socket 1 00:05:31.124 EAL: Detected lcore 88 as core 4 on socket 1 00:05:31.124 EAL: Detected lcore 89 as core 5 on socket 1 00:05:31.124 EAL: Detected lcore 90 as core 6 on socket 1 00:05:31.124 EAL: Detected lcore 91 as core 8 on socket 1 00:05:31.124 EAL: Detected lcore 92 as core 9 on socket 1 00:05:31.124 EAL: Detected lcore 93 as core 10 on socket 1 00:05:31.124 EAL: Detected lcore 94 as core 11 on socket 1 00:05:31.124 EAL: Detected lcore 95 as core 12 on socket 1 00:05:31.124 EAL: Detected lcore 96 as core 13 on socket 1 00:05:31.124 EAL: Detected lcore 97 as core 14 on socket 1 00:05:31.124 EAL: Detected lcore 98 as core 16 on socket 1 00:05:31.124 EAL: Detected lcore 99 as core 17 on socket 1 00:05:31.124 EAL: Detected lcore 100 as core 18 on socket 1 00:05:31.124 EAL: Detected lcore 101 as core 19 on socket 1 00:05:31.124 EAL: Detected lcore 102 as core 20 on socket 1 00:05:31.124 EAL: Detected lcore 103 as core 21 on socket 1 00:05:31.124 EAL: Detected lcore 104 as core 22 on socket 1 00:05:31.124 EAL: Detected lcore 105 as core 24 on socket 1 00:05:31.124 EAL: Detected lcore 106 as core 25 on socket 1 00:05:31.124 EAL: Detected lcore 107 as core 26 on socket 1 00:05:31.124 EAL: Detected lcore 108 as core 27 on socket 1 00:05:31.124 EAL: Detected lcore 109 as core 28 on socket 1 00:05:31.124 EAL: Detected lcore 110 as core 29 on socket 1 00:05:31.124 EAL: Detected lcore 111 as core 30 on socket 1 00:05:31.124 EAL: Maximum logical cores by configuration: 128 00:05:31.124 EAL: Detected CPU lcores: 112 00:05:31.124 EAL: Detected NUMA nodes: 2 00:05:31.124 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:05:31.124 EAL: Checking presence of .so 'librte_eal.so.23' 00:05:31.124 EAL: Checking presence of .so 'librte_eal.so' 00:05:31.124 EAL: Detected static linkage of DPDK 00:05:31.124 EAL: No shared files mode enabled, IPC will be disabled 00:05:31.124 EAL: Bus pci wants IOVA as 'DC' 00:05:31.124 EAL: Buses did not request a specific IOVA mode. 00:05:31.124 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:31.124 EAL: Selected IOVA mode 'VA' 00:05:31.124 EAL: No free 2048 kB hugepages reported on node 1 00:05:31.124 EAL: Probing VFIO support... 
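The recurring "No free 2048 kB hugepages reported on node 1" notice is consistent with the earlier status table: the 2MB pool (2048/2048 pages) was reserved on node 0 only. A sketch of balancing the reservation across both NUMA nodes instead, assuming root and the standard sysfs layout:

for node in node0 node1; do
    # reserve 1024 x 2MB pages on each node (2GB per node)
    echo 1024 | sudo tee /sys/devices/system/node/$node/hugepages/hugepages-2048kB/nr_hugepages
done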
00:05:31.124 EAL: IOMMU type 1 (Type 1) is supported 00:05:31.124 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:31.124 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:31.124 EAL: VFIO support initialized 00:05:31.124 EAL: Ask a virtual area of 0x2e000 bytes 00:05:31.124 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:31.124 EAL: Setting up physically contiguous memory... 00:05:31.124 EAL: Setting maximum number of open files to 524288 00:05:31.124 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:31.124 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:31.124 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:31.124 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.124 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:31.124 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:31.124 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.124 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:31.124 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:31.124 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.124 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:31.124 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:31.124 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.124 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:31.124 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:31.124 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.124 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:31.124 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:31.124 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.124 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:31.124 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:31.124 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.124 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:31.124 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:31.124 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.124 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:31.124 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:31.124 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:31.124 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.124 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:31.124 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:31.124 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.124 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:31.124 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:31.124 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.124 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:31.124 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:31.124 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.124 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:31.124 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:31.124 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.124 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:31.124 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:31.124 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.124 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:31.124 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:31.124 EAL: Ask a virtual area of 0x61000 bytes 00:05:31.124 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:31.124 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:31.124 EAL: Ask a virtual area of 0x400000000 bytes 00:05:31.124 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:31.124 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:31.124 EAL: Hugepages will be freed exactly as allocated. 00:05:31.124 EAL: No shared files mode enabled, IPC is disabled 00:05:31.124 EAL: No shared files mode enabled, IPC is disabled 00:05:31.124 EAL: TSC frequency is ~2500000 KHz 00:05:31.124 EAL: Main lcore 0 is ready (tid=7eff116dda00;cpuset=[0]) 00:05:31.124 EAL: Trying to obtain current memory policy. 00:05:31.124 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.124 EAL: Restoring previous memory policy: 0 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was expanded by 2MB 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Mem event callback 'spdk:(nil)' registered 00:05:31.125 00:05:31.125 00:05:31.125 CUnit - A unit testing framework for C - Version 2.1-3 00:05:31.125 http://cunit.sourceforge.net/ 00:05:31.125 00:05:31.125 00:05:31.125 Suite: components_suite 00:05:31.125 Test: vtophys_malloc_test ...passed 00:05:31.125 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:31.125 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.125 EAL: Restoring previous memory policy: 4 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was expanded by 4MB 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was shrunk by 4MB 00:05:31.125 EAL: Trying to obtain current memory policy. 00:05:31.125 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.125 EAL: Restoring previous memory policy: 4 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was expanded by 6MB 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was shrunk by 6MB 00:05:31.125 EAL: Trying to obtain current memory policy. 00:05:31.125 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.125 EAL: Restoring previous memory policy: 4 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was expanded by 10MB 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was shrunk by 10MB 00:05:31.125 EAL: Trying to obtain current memory policy. 
00:05:31.125 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.125 EAL: Restoring previous memory policy: 4 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was expanded by 18MB 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was shrunk by 18MB 00:05:31.125 EAL: Trying to obtain current memory policy. 00:05:31.125 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.125 EAL: Restoring previous memory policy: 4 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was expanded by 34MB 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was shrunk by 34MB 00:05:31.125 EAL: Trying to obtain current memory policy. 00:05:31.125 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.125 EAL: Restoring previous memory policy: 4 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was expanded by 66MB 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was shrunk by 66MB 00:05:31.125 EAL: Trying to obtain current memory policy. 00:05:31.125 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.125 EAL: Restoring previous memory policy: 4 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was expanded by 130MB 00:05:31.125 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.125 EAL: request: mp_malloc_sync 00:05:31.125 EAL: No shared files mode enabled, IPC is disabled 00:05:31.125 EAL: Heap on socket 0 was shrunk by 130MB 00:05:31.125 EAL: Trying to obtain current memory policy. 00:05:31.125 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.384 EAL: Restoring previous memory policy: 4 00:05:31.384 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.384 EAL: request: mp_malloc_sync 00:05:31.384 EAL: No shared files mode enabled, IPC is disabled 00:05:31.385 EAL: Heap on socket 0 was expanded by 258MB 00:05:31.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.385 EAL: request: mp_malloc_sync 00:05:31.385 EAL: No shared files mode enabled, IPC is disabled 00:05:31.385 EAL: Heap on socket 0 was shrunk by 258MB 00:05:31.385 EAL: Trying to obtain current memory policy. 
00:05:31.385 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.385 EAL: Restoring previous memory policy: 4 00:05:31.385 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.385 EAL: request: mp_malloc_sync 00:05:31.385 EAL: No shared files mode enabled, IPC is disabled 00:05:31.385 EAL: Heap on socket 0 was expanded by 514MB 00:05:31.643 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.643 EAL: request: mp_malloc_sync 00:05:31.644 EAL: No shared files mode enabled, IPC is disabled 00:05:31.644 EAL: Heap on socket 0 was shrunk by 514MB 00:05:31.644 EAL: Trying to obtain current memory policy. 00:05:31.644 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:31.903 EAL: Restoring previous memory policy: 4 00:05:31.903 EAL: Calling mem event callback 'spdk:(nil)' 00:05:31.903 EAL: request: mp_malloc_sync 00:05:31.903 EAL: No shared files mode enabled, IPC is disabled 00:05:31.903 EAL: Heap on socket 0 was expanded by 1026MB 00:05:31.903 EAL: Calling mem event callback 'spdk:(nil)' 00:05:32.162 EAL: request: mp_malloc_sync 00:05:32.162 EAL: No shared files mode enabled, IPC is disabled 00:05:32.162 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:32.162 passed 00:05:32.162 00:05:32.162 Run Summary: Type Total Ran Passed Failed Inactive 00:05:32.162 suites 1 1 n/a 0 0 00:05:32.162 tests 2 2 2 0 0 00:05:32.162 asserts 497 497 497 0 n/a 00:05:32.162 00:05:32.162 Elapsed time = 0.978 seconds 00:05:32.162 EAL: Calling mem event callback 'spdk:(nil)' 00:05:32.162 EAL: request: mp_malloc_sync 00:05:32.162 EAL: No shared files mode enabled, IPC is disabled 00:05:32.162 EAL: Heap on socket 0 was shrunk by 2MB 00:05:32.162 EAL: No shared files mode enabled, IPC is disabled 00:05:32.162 EAL: No shared files mode enabled, IPC is disabled 00:05:32.162 EAL: No shared files mode enabled, IPC is disabled 00:05:32.162 00:05:32.162 real 0m1.116s 00:05:32.162 user 0m0.649s 00:05:32.162 sys 0m0.433s 00:05:32.162 16:41:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.162 16:41:17 -- common/autotest_common.sh@10 -- # set +x 00:05:32.162 ************************************ 00:05:32.162 END TEST env_vtophys 00:05:32.162 ************************************ 00:05:32.162 16:41:17 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:32.162 16:41:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:32.162 16:41:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.162 16:41:17 -- common/autotest_common.sh@10 -- # set +x 00:05:32.162 ************************************ 00:05:32.162 START TEST env_pci 00:05:32.162 ************************************ 00:05:32.162 16:41:17 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:32.162 00:05:32.162 00:05:32.162 CUnit - A unit testing framework for C - Version 2.1-3 00:05:32.162 http://cunit.sourceforge.net/ 00:05:32.162 00:05:32.162 00:05:32.162 Suite: pci 00:05:32.162 Test: pci_hook ...[2024-11-16 16:41:17.837594] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 459012 has claimed it 00:05:32.162 EAL: Cannot find device (10000:00:01.0) 00:05:32.162 EAL: Failed to attach device on primary process 00:05:32.162 passed 00:05:32.162 00:05:32.162 Run Summary: Type Total Ran Passed Failed Inactive 00:05:32.162 suites 1 1 n/a 0 0 00:05:32.162 tests 1 1 1 0 0 
00:05:32.162 asserts 25 25 25 0 n/a 00:05:32.162 00:05:32.162 Elapsed time = 0.037 seconds 00:05:32.162 00:05:32.162 real 0m0.055s 00:05:32.162 user 0m0.014s 00:05:32.162 sys 0m0.040s 00:05:32.162 16:41:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.162 16:41:17 -- common/autotest_common.sh@10 -- # set +x 00:05:32.162 ************************************ 00:05:32.162 END TEST env_pci 00:05:32.162 ************************************ 00:05:32.421 16:41:17 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:32.421 16:41:17 -- env/env.sh@15 -- # uname 00:05:32.421 16:41:17 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:32.421 16:41:17 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:32.421 16:41:17 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:32.421 16:41:17 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:32.421 16:41:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.421 16:41:17 -- common/autotest_common.sh@10 -- # set +x 00:05:32.421 ************************************ 00:05:32.421 START TEST env_dpdk_post_init 00:05:32.421 ************************************ 00:05:32.421 16:41:17 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:32.421 EAL: Detected CPU lcores: 112 00:05:32.421 EAL: Detected NUMA nodes: 2 00:05:32.421 EAL: Detected static linkage of DPDK 00:05:32.421 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:32.421 EAL: Selected IOVA mode 'VA' 00:05:32.421 EAL: No free 2048 kB hugepages reported on node 1 00:05:32.421 EAL: VFIO support initialized 00:05:32.421 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:32.421 EAL: Using IOMMU type 1 (Type 1) 00:05:33.361 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:36.653 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:36.653 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:37.223 Starting DPDK initialization... 00:05:37.223 Starting SPDK post initialization... 00:05:37.223 SPDK NVMe probe 00:05:37.223 Attaching to 0000:d8:00.0 00:05:37.223 Attached to 0000:d8:00.0 00:05:37.223 Cleaning up... 
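env_dpdk_post_init is the one test in this suite that attaches to real hardware, so it is launched with the same core mask and base virtual address that spdk_tgt used earlier. Reproducing it standalone would look roughly like the following (the binary path assumes an in-tree SPDK build, and the NVMe device must already be bound to vfio-pci):

sudo test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000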
00:05:37.223 00:05:37.223 real 0m4.754s 00:05:37.223 user 0m3.586s 00:05:37.223 sys 0m0.409s 00:05:37.223 16:41:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.223 16:41:22 -- common/autotest_common.sh@10 -- # set +x 00:05:37.223 ************************************ 00:05:37.223 END TEST env_dpdk_post_init 00:05:37.223 ************************************ 00:05:37.223 16:41:22 -- env/env.sh@26 -- # uname 00:05:37.223 16:41:22 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:37.223 16:41:22 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:37.223 16:41:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:37.223 16:41:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.223 16:41:22 -- common/autotest_common.sh@10 -- # set +x 00:05:37.223 ************************************ 00:05:37.223 START TEST env_mem_callbacks 00:05:37.223 ************************************ 00:05:37.223 16:41:22 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:37.223 EAL: Detected CPU lcores: 112 00:05:37.223 EAL: Detected NUMA nodes: 2 00:05:37.223 EAL: Detected static linkage of DPDK 00:05:37.223 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:37.223 EAL: Selected IOVA mode 'VA' 00:05:37.223 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.223 EAL: VFIO support initialized 00:05:37.223 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:37.223 00:05:37.223 00:05:37.223 CUnit - A unit testing framework for C - Version 2.1-3 00:05:37.223 http://cunit.sourceforge.net/ 00:05:37.223 00:05:37.223 00:05:37.223 Suite: memory 00:05:37.223 Test: test ... 
00:05:37.223 register 0x200000200000 2097152 00:05:37.223 malloc 3145728 00:05:37.223 register 0x200000400000 4194304 00:05:37.223 buf 0x200000500000 len 3145728 PASSED 00:05:37.223 malloc 64 00:05:37.223 buf 0x2000004fff40 len 64 PASSED 00:05:37.223 malloc 4194304 00:05:37.223 register 0x200000800000 6291456 00:05:37.223 buf 0x200000a00000 len 4194304 PASSED 00:05:37.223 free 0x200000500000 3145728 00:05:37.223 free 0x2000004fff40 64 00:05:37.223 unregister 0x200000400000 4194304 PASSED 00:05:37.223 free 0x200000a00000 4194304 00:05:37.223 unregister 0x200000800000 6291456 PASSED 00:05:37.223 malloc 8388608 00:05:37.223 register 0x200000400000 10485760 00:05:37.223 buf 0x200000600000 len 8388608 PASSED 00:05:37.223 free 0x200000600000 8388608 00:05:37.223 unregister 0x200000400000 10485760 PASSED 00:05:37.223 passed 00:05:37.223 00:05:37.223 Run Summary: Type Total Ran Passed Failed Inactive 00:05:37.223 suites 1 1 n/a 0 0 00:05:37.223 tests 1 1 1 0 0 00:05:37.223 asserts 15 15 15 0 n/a 00:05:37.223 00:05:37.223 Elapsed time = 0.007 seconds 00:05:37.223 00:05:37.223 real 0m0.067s 00:05:37.223 user 0m0.017s 00:05:37.223 sys 0m0.050s 00:05:37.223 16:41:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.223 16:41:22 -- common/autotest_common.sh@10 -- # set +x 00:05:37.223 ************************************ 00:05:37.223 END TEST env_mem_callbacks 00:05:37.223 ************************************ 00:05:37.223 00:05:37.223 real 0m6.537s 00:05:37.223 user 0m4.536s 00:05:37.223 sys 0m1.266s 00:05:37.223 16:41:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.223 16:41:22 -- common/autotest_common.sh@10 -- # set +x 00:05:37.223 ************************************ 00:05:37.223 END TEST env 00:05:37.223 ************************************ 00:05:37.223 16:41:22 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:37.223 16:41:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:37.223 16:41:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.223 16:41:22 -- common/autotest_common.sh@10 -- # set +x 00:05:37.223 ************************************ 00:05:37.223 START TEST rpc 00:05:37.223 ************************************ 00:05:37.223 16:41:22 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:37.483 * Looking for test storage... 
00:05:37.483 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:37.483 16:41:23 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:37.483 16:41:23 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:37.483 16:41:23 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:37.483 16:41:23 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:37.483 16:41:23 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:37.483 16:41:23 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:37.483 16:41:23 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:37.483 16:41:23 -- scripts/common.sh@335 -- # IFS=.-: 00:05:37.483 16:41:23 -- scripts/common.sh@335 -- # read -ra ver1 00:05:37.483 16:41:23 -- scripts/common.sh@336 -- # IFS=.-: 00:05:37.483 16:41:23 -- scripts/common.sh@336 -- # read -ra ver2 00:05:37.483 16:41:23 -- scripts/common.sh@337 -- # local 'op=<' 00:05:37.483 16:41:23 -- scripts/common.sh@339 -- # ver1_l=2 00:05:37.483 16:41:23 -- scripts/common.sh@340 -- # ver2_l=1 00:05:37.483 16:41:23 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:37.483 16:41:23 -- scripts/common.sh@343 -- # case "$op" in 00:05:37.483 16:41:23 -- scripts/common.sh@344 -- # : 1 00:05:37.483 16:41:23 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:37.483 16:41:23 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:37.483 16:41:23 -- scripts/common.sh@364 -- # decimal 1 00:05:37.483 16:41:23 -- scripts/common.sh@352 -- # local d=1 00:05:37.483 16:41:23 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:37.483 16:41:23 -- scripts/common.sh@354 -- # echo 1 00:05:37.483 16:41:23 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:37.483 16:41:23 -- scripts/common.sh@365 -- # decimal 2 00:05:37.483 16:41:23 -- scripts/common.sh@352 -- # local d=2 00:05:37.483 16:41:23 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:37.483 16:41:23 -- scripts/common.sh@354 -- # echo 2 00:05:37.483 16:41:23 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:37.483 16:41:23 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:37.483 16:41:23 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:37.483 16:41:23 -- scripts/common.sh@367 -- # return 0 00:05:37.483 16:41:23 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:37.483 16:41:23 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:37.483 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.483 --rc genhtml_branch_coverage=1 00:05:37.483 --rc genhtml_function_coverage=1 00:05:37.483 --rc genhtml_legend=1 00:05:37.483 --rc geninfo_all_blocks=1 00:05:37.483 --rc geninfo_unexecuted_blocks=1 00:05:37.483 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.483 ' 00:05:37.483 16:41:23 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:37.483 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.483 --rc genhtml_branch_coverage=1 00:05:37.483 --rc genhtml_function_coverage=1 00:05:37.483 --rc genhtml_legend=1 00:05:37.483 --rc geninfo_all_blocks=1 00:05:37.483 --rc geninfo_unexecuted_blocks=1 00:05:37.483 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.483 ' 00:05:37.483 16:41:23 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:37.483 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.483 --rc genhtml_branch_coverage=1 00:05:37.483 
--rc genhtml_function_coverage=1 00:05:37.483 --rc genhtml_legend=1 00:05:37.483 --rc geninfo_all_blocks=1 00:05:37.483 --rc geninfo_unexecuted_blocks=1 00:05:37.483 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.483 ' 00:05:37.483 16:41:23 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:37.483 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:37.483 --rc genhtml_branch_coverage=1 00:05:37.483 --rc genhtml_function_coverage=1 00:05:37.483 --rc genhtml_legend=1 00:05:37.483 --rc geninfo_all_blocks=1 00:05:37.483 --rc geninfo_unexecuted_blocks=1 00:05:37.483 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:37.483 ' 00:05:37.483 16:41:23 -- rpc/rpc.sh@65 -- # spdk_pid=459985 00:05:37.483 16:41:23 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:37.483 16:41:23 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:37.483 16:41:23 -- rpc/rpc.sh@67 -- # waitforlisten 459985 00:05:37.483 16:41:23 -- common/autotest_common.sh@829 -- # '[' -z 459985 ']' 00:05:37.483 16:41:23 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:37.483 16:41:23 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:37.483 16:41:23 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:37.483 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:37.483 16:41:23 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:37.483 16:41:23 -- common/autotest_common.sh@10 -- # set +x 00:05:37.483 [2024-11-16 16:41:23.111595] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:05:37.483 [2024-11-16 16:41:23.111732] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid459985 ] 00:05:37.483 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.483 [2024-11-16 16:41:23.193744] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:37.483 [2024-11-16 16:41:23.231318] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:37.742 [2024-11-16 16:41:23.231427] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:37.742 [2024-11-16 16:41:23.231438] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 459985' to capture a snapshot of events at runtime. 00:05:37.742 [2024-11-16 16:41:23.231447] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid459985 for offline analysis/debug. 
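The launch sequence above is the standard autotest pattern: start spdk_tgt in the background, install a trap so the target dies if the suite aborts, then block until the RPC socket answers. A condensed sketch under placeholder paths; the real waitforlisten in autotest_common.sh probes the RPC server rather than merely testing for the socket file, and the suite clears this trap before its final killprocess (visible further down as 'trap - SIGINT SIGTERM EXIT'):

spdk_tgt=./build/bin/spdk_tgt      # PLACEHOLDER path
rpc_sock=/var/tmp/spdk.sock
"$spdk_tgt" -e bdev &
spdk_pid=$!
trap 'kill "$spdk_pid"; exit 1' SIGINT SIGTERM EXIT
for _ in $(seq 1 100); do          # crude stand-in for waitforlisten
    [ -S "$rpc_sock" ] && break
    sleep 0.1
done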
00:05:37.742 [2024-11-16 16:41:23.231469] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:38.311 16:41:23 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:38.311 16:41:23 -- common/autotest_common.sh@862 -- # return 0 00:05:38.311 16:41:23 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:38.311 16:41:23 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:38.311 16:41:23 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:38.311 16:41:23 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:38.311 16:41:23 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:38.311 16:41:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:38.311 16:41:23 -- common/autotest_common.sh@10 -- # set +x 00:05:38.311 ************************************ 00:05:38.311 START TEST rpc_integrity 00:05:38.311 ************************************ 00:05:38.311 16:41:23 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:38.311 16:41:23 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:38.311 16:41:23 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.311 16:41:23 -- common/autotest_common.sh@10 -- # set +x 00:05:38.311 16:41:23 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.311 16:41:23 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:38.311 16:41:23 -- rpc/rpc.sh@13 -- # jq length 00:05:38.311 16:41:24 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:38.311 16:41:24 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:38.311 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.311 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.311 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.311 16:41:24 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:38.311 16:41:24 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:38.312 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.312 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.312 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.312 16:41:24 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:38.312 { 00:05:38.312 "name": "Malloc0", 00:05:38.312 "aliases": [ 00:05:38.312 "d4aedb26-e9bb-40ed-b8ac-f3d3256149ae" 00:05:38.312 ], 00:05:38.312 "product_name": "Malloc disk", 00:05:38.312 "block_size": 512, 00:05:38.312 "num_blocks": 16384, 00:05:38.312 "uuid": "d4aedb26-e9bb-40ed-b8ac-f3d3256149ae", 00:05:38.312 "assigned_rate_limits": { 00:05:38.312 "rw_ios_per_sec": 0, 00:05:38.312 "rw_mbytes_per_sec": 0, 00:05:38.312 "r_mbytes_per_sec": 0, 00:05:38.312 "w_mbytes_per_sec": 0 00:05:38.312 }, 00:05:38.312 "claimed": false, 00:05:38.312 "zoned": false, 00:05:38.312 "supported_io_types": { 00:05:38.312 "read": true, 00:05:38.312 "write": true, 00:05:38.312 "unmap": true, 00:05:38.312 "write_zeroes": true, 00:05:38.312 "flush": true, 00:05:38.312 "reset": true, 00:05:38.312 "compare": false, 00:05:38.312 "compare_and_write": false, 
00:05:38.312 "abort": true, 00:05:38.312 "nvme_admin": false, 00:05:38.312 "nvme_io": false 00:05:38.312 }, 00:05:38.312 "memory_domains": [ 00:05:38.312 { 00:05:38.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:38.312 "dma_device_type": 2 00:05:38.312 } 00:05:38.312 ], 00:05:38.312 "driver_specific": {} 00:05:38.312 } 00:05:38.312 ]' 00:05:38.312 16:41:24 -- rpc/rpc.sh@17 -- # jq length 00:05:38.571 16:41:24 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:38.571 16:41:24 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:38.571 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.571 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.571 [2024-11-16 16:41:24.079861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:38.571 [2024-11-16 16:41:24.079894] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:38.571 [2024-11-16 16:41:24.079915] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x505f850 00:05:38.571 [2024-11-16 16:41:24.079925] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:38.571 [2024-11-16 16:41:24.080752] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:38.571 [2024-11-16 16:41:24.080773] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:38.571 Passthru0 00:05:38.571 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.571 16:41:24 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:38.571 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.571 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.571 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.571 16:41:24 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:38.571 { 00:05:38.571 "name": "Malloc0", 00:05:38.571 "aliases": [ 00:05:38.571 "d4aedb26-e9bb-40ed-b8ac-f3d3256149ae" 00:05:38.571 ], 00:05:38.571 "product_name": "Malloc disk", 00:05:38.571 "block_size": 512, 00:05:38.571 "num_blocks": 16384, 00:05:38.571 "uuid": "d4aedb26-e9bb-40ed-b8ac-f3d3256149ae", 00:05:38.571 "assigned_rate_limits": { 00:05:38.571 "rw_ios_per_sec": 0, 00:05:38.571 "rw_mbytes_per_sec": 0, 00:05:38.571 "r_mbytes_per_sec": 0, 00:05:38.571 "w_mbytes_per_sec": 0 00:05:38.571 }, 00:05:38.571 "claimed": true, 00:05:38.571 "claim_type": "exclusive_write", 00:05:38.571 "zoned": false, 00:05:38.571 "supported_io_types": { 00:05:38.571 "read": true, 00:05:38.571 "write": true, 00:05:38.571 "unmap": true, 00:05:38.571 "write_zeroes": true, 00:05:38.571 "flush": true, 00:05:38.571 "reset": true, 00:05:38.571 "compare": false, 00:05:38.571 "compare_and_write": false, 00:05:38.571 "abort": true, 00:05:38.571 "nvme_admin": false, 00:05:38.571 "nvme_io": false 00:05:38.571 }, 00:05:38.571 "memory_domains": [ 00:05:38.571 { 00:05:38.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:38.571 "dma_device_type": 2 00:05:38.571 } 00:05:38.571 ], 00:05:38.571 "driver_specific": {} 00:05:38.571 }, 00:05:38.571 { 00:05:38.571 "name": "Passthru0", 00:05:38.571 "aliases": [ 00:05:38.571 "4daef5d2-eee2-51e2-9a6d-c6d98de16eab" 00:05:38.571 ], 00:05:38.571 "product_name": "passthru", 00:05:38.571 "block_size": 512, 00:05:38.571 "num_blocks": 16384, 00:05:38.571 "uuid": "4daef5d2-eee2-51e2-9a6d-c6d98de16eab", 00:05:38.571 "assigned_rate_limits": { 00:05:38.571 "rw_ios_per_sec": 0, 00:05:38.571 "rw_mbytes_per_sec": 0, 00:05:38.571 "r_mbytes_per_sec": 0, 00:05:38.571 
"w_mbytes_per_sec": 0 00:05:38.571 }, 00:05:38.571 "claimed": false, 00:05:38.571 "zoned": false, 00:05:38.571 "supported_io_types": { 00:05:38.571 "read": true, 00:05:38.571 "write": true, 00:05:38.571 "unmap": true, 00:05:38.571 "write_zeroes": true, 00:05:38.571 "flush": true, 00:05:38.571 "reset": true, 00:05:38.571 "compare": false, 00:05:38.571 "compare_and_write": false, 00:05:38.571 "abort": true, 00:05:38.571 "nvme_admin": false, 00:05:38.571 "nvme_io": false 00:05:38.571 }, 00:05:38.571 "memory_domains": [ 00:05:38.571 { 00:05:38.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:38.571 "dma_device_type": 2 00:05:38.572 } 00:05:38.572 ], 00:05:38.572 "driver_specific": { 00:05:38.572 "passthru": { 00:05:38.572 "name": "Passthru0", 00:05:38.572 "base_bdev_name": "Malloc0" 00:05:38.572 } 00:05:38.572 } 00:05:38.572 } 00:05:38.572 ]' 00:05:38.572 16:41:24 -- rpc/rpc.sh@21 -- # jq length 00:05:38.572 16:41:24 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:38.572 16:41:24 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:38.572 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.572 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.572 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.572 16:41:24 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:38.572 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.572 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.572 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.572 16:41:24 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:38.572 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.572 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.572 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.572 16:41:24 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:38.572 16:41:24 -- rpc/rpc.sh@26 -- # jq length 00:05:38.572 16:41:24 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:38.572 00:05:38.572 real 0m0.273s 00:05:38.572 user 0m0.157s 00:05:38.572 sys 0m0.051s 00:05:38.572 16:41:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:38.572 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.572 ************************************ 00:05:38.572 END TEST rpc_integrity 00:05:38.572 ************************************ 00:05:38.572 16:41:24 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:38.572 16:41:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:38.572 16:41:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:38.572 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.572 ************************************ 00:05:38.572 START TEST rpc_plugins 00:05:38.572 ************************************ 00:05:38.572 16:41:24 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:05:38.572 16:41:24 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:38.572 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.572 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.572 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.572 16:41:24 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:38.572 16:41:24 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:38.572 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.572 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.572 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.572 16:41:24 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:05:38.572 { 00:05:38.572 "name": "Malloc1", 00:05:38.572 "aliases": [ 00:05:38.572 "ab195dc1-45a2-4eb6-9303-ddd320e7f949" 00:05:38.572 ], 00:05:38.572 "product_name": "Malloc disk", 00:05:38.572 "block_size": 4096, 00:05:38.572 "num_blocks": 256, 00:05:38.572 "uuid": "ab195dc1-45a2-4eb6-9303-ddd320e7f949", 00:05:38.572 "assigned_rate_limits": { 00:05:38.572 "rw_ios_per_sec": 0, 00:05:38.572 "rw_mbytes_per_sec": 0, 00:05:38.572 "r_mbytes_per_sec": 0, 00:05:38.572 "w_mbytes_per_sec": 0 00:05:38.572 }, 00:05:38.572 "claimed": false, 00:05:38.572 "zoned": false, 00:05:38.572 "supported_io_types": { 00:05:38.572 "read": true, 00:05:38.572 "write": true, 00:05:38.572 "unmap": true, 00:05:38.572 "write_zeroes": true, 00:05:38.572 "flush": true, 00:05:38.572 "reset": true, 00:05:38.572 "compare": false, 00:05:38.572 "compare_and_write": false, 00:05:38.572 "abort": true, 00:05:38.572 "nvme_admin": false, 00:05:38.572 "nvme_io": false 00:05:38.572 }, 00:05:38.572 "memory_domains": [ 00:05:38.572 { 00:05:38.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:38.572 "dma_device_type": 2 00:05:38.572 } 00:05:38.572 ], 00:05:38.572 "driver_specific": {} 00:05:38.572 } 00:05:38.572 ]' 00:05:38.572 16:41:24 -- rpc/rpc.sh@32 -- # jq length 00:05:38.831 16:41:24 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:38.831 16:41:24 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:38.831 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.831 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.831 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.831 16:41:24 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:38.831 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.831 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.831 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.831 16:41:24 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:38.831 16:41:24 -- rpc/rpc.sh@36 -- # jq length 00:05:38.831 16:41:24 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:38.831 00:05:38.831 real 0m0.149s 00:05:38.831 user 0m0.090s 00:05:38.831 sys 0m0.023s 00:05:38.831 16:41:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:38.831 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.831 ************************************ 00:05:38.831 END TEST rpc_plugins 00:05:38.831 ************************************ 00:05:38.831 16:41:24 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:38.831 16:41:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:38.831 16:41:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:38.831 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.831 ************************************ 00:05:38.831 START TEST rpc_trace_cmd_test 00:05:38.831 ************************************ 00:05:38.831 16:41:24 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:05:38.831 16:41:24 -- rpc/rpc.sh@40 -- # local info 00:05:38.831 16:41:24 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:38.831 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:38.832 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:38.832 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:38.832 16:41:24 -- rpc/rpc.sh@42 -- # info='{ 00:05:38.832 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid459985", 00:05:38.832 "tpoint_group_mask": "0x8", 00:05:38.832 "iscsi_conn": { 00:05:38.832 "mask": "0x2", 
00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "scsi": { 00:05:38.832 "mask": "0x4", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "bdev": { 00:05:38.832 "mask": "0x8", 00:05:38.832 "tpoint_mask": "0xffffffffffffffff" 00:05:38.832 }, 00:05:38.832 "nvmf_rdma": { 00:05:38.832 "mask": "0x10", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "nvmf_tcp": { 00:05:38.832 "mask": "0x20", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "ftl": { 00:05:38.832 "mask": "0x40", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "blobfs": { 00:05:38.832 "mask": "0x80", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "dsa": { 00:05:38.832 "mask": "0x200", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "thread": { 00:05:38.832 "mask": "0x400", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "nvme_pcie": { 00:05:38.832 "mask": "0x800", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "iaa": { 00:05:38.832 "mask": "0x1000", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "nvme_tcp": { 00:05:38.832 "mask": "0x2000", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 }, 00:05:38.832 "bdev_nvme": { 00:05:38.832 "mask": "0x4000", 00:05:38.832 "tpoint_mask": "0x0" 00:05:38.832 } 00:05:38.832 }' 00:05:38.832 16:41:24 -- rpc/rpc.sh@43 -- # jq length 00:05:38.832 16:41:24 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:38.832 16:41:24 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:38.832 16:41:24 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:38.832 16:41:24 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:39.091 16:41:24 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:39.091 16:41:24 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:39.091 16:41:24 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:39.091 16:41:24 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:39.091 16:41:24 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:39.091 00:05:39.091 real 0m0.228s 00:05:39.091 user 0m0.189s 00:05:39.091 sys 0m0.033s 00:05:39.091 16:41:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.091 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.091 ************************************ 00:05:39.091 END TEST rpc_trace_cmd_test 00:05:39.091 ************************************ 00:05:39.091 16:41:24 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:39.091 16:41:24 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:39.091 16:41:24 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:39.091 16:41:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.091 16:41:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.091 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.091 ************************************ 00:05:39.091 START TEST rpc_daemon_integrity 00:05:39.091 ************************************ 00:05:39.091 16:41:24 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:39.091 16:41:24 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:39.091 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.091 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.091 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.091 16:41:24 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:39.091 16:41:24 -- rpc/rpc.sh@13 -- # jq length 00:05:39.091 16:41:24 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:39.091 16:41:24 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:39.091 
16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.091 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.091 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.091 16:41:24 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:39.091 16:41:24 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:39.091 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.091 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.091 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.091 16:41:24 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:39.091 { 00:05:39.091 "name": "Malloc2", 00:05:39.091 "aliases": [ 00:05:39.091 "00389569-73ec-47f0-8025-d9852f9cd08c" 00:05:39.091 ], 00:05:39.091 "product_name": "Malloc disk", 00:05:39.091 "block_size": 512, 00:05:39.091 "num_blocks": 16384, 00:05:39.091 "uuid": "00389569-73ec-47f0-8025-d9852f9cd08c", 00:05:39.091 "assigned_rate_limits": { 00:05:39.091 "rw_ios_per_sec": 0, 00:05:39.091 "rw_mbytes_per_sec": 0, 00:05:39.091 "r_mbytes_per_sec": 0, 00:05:39.091 "w_mbytes_per_sec": 0 00:05:39.091 }, 00:05:39.091 "claimed": false, 00:05:39.091 "zoned": false, 00:05:39.091 "supported_io_types": { 00:05:39.091 "read": true, 00:05:39.091 "write": true, 00:05:39.091 "unmap": true, 00:05:39.091 "write_zeroes": true, 00:05:39.091 "flush": true, 00:05:39.091 "reset": true, 00:05:39.091 "compare": false, 00:05:39.091 "compare_and_write": false, 00:05:39.091 "abort": true, 00:05:39.091 "nvme_admin": false, 00:05:39.091 "nvme_io": false 00:05:39.091 }, 00:05:39.091 "memory_domains": [ 00:05:39.091 { 00:05:39.091 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.091 "dma_device_type": 2 00:05:39.091 } 00:05:39.091 ], 00:05:39.091 "driver_specific": {} 00:05:39.091 } 00:05:39.091 ]' 00:05:39.091 16:41:24 -- rpc/rpc.sh@17 -- # jq length 00:05:39.351 16:41:24 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:39.351 16:41:24 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:39.351 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.351 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.351 [2024-11-16 16:41:24.877933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:39.351 [2024-11-16 16:41:24.877962] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:39.351 [2024-11-16 16:41:24.877977] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x50614c0 00:05:39.351 [2024-11-16 16:41:24.877987] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:39.352 [2024-11-16 16:41:24.878800] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:39.352 [2024-11-16 16:41:24.878825] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:39.352 Passthru0 00:05:39.352 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.352 16:41:24 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:39.352 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.352 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.352 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.352 16:41:24 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:39.352 { 00:05:39.352 "name": "Malloc2", 00:05:39.352 "aliases": [ 00:05:39.352 "00389569-73ec-47f0-8025-d9852f9cd08c" 00:05:39.352 ], 00:05:39.352 "product_name": "Malloc disk", 00:05:39.352 "block_size": 512, 00:05:39.352 "num_blocks": 16384, 
00:05:39.352 "uuid": "00389569-73ec-47f0-8025-d9852f9cd08c", 00:05:39.352 "assigned_rate_limits": { 00:05:39.352 "rw_ios_per_sec": 0, 00:05:39.352 "rw_mbytes_per_sec": 0, 00:05:39.352 "r_mbytes_per_sec": 0, 00:05:39.352 "w_mbytes_per_sec": 0 00:05:39.352 }, 00:05:39.352 "claimed": true, 00:05:39.352 "claim_type": "exclusive_write", 00:05:39.352 "zoned": false, 00:05:39.352 "supported_io_types": { 00:05:39.352 "read": true, 00:05:39.352 "write": true, 00:05:39.352 "unmap": true, 00:05:39.352 "write_zeroes": true, 00:05:39.352 "flush": true, 00:05:39.352 "reset": true, 00:05:39.352 "compare": false, 00:05:39.352 "compare_and_write": false, 00:05:39.352 "abort": true, 00:05:39.352 "nvme_admin": false, 00:05:39.352 "nvme_io": false 00:05:39.352 }, 00:05:39.352 "memory_domains": [ 00:05:39.352 { 00:05:39.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.352 "dma_device_type": 2 00:05:39.352 } 00:05:39.352 ], 00:05:39.352 "driver_specific": {} 00:05:39.352 }, 00:05:39.352 { 00:05:39.352 "name": "Passthru0", 00:05:39.352 "aliases": [ 00:05:39.352 "aae99457-49a9-57bc-94f5-01fa431e3966" 00:05:39.352 ], 00:05:39.352 "product_name": "passthru", 00:05:39.352 "block_size": 512, 00:05:39.352 "num_blocks": 16384, 00:05:39.352 "uuid": "aae99457-49a9-57bc-94f5-01fa431e3966", 00:05:39.352 "assigned_rate_limits": { 00:05:39.352 "rw_ios_per_sec": 0, 00:05:39.352 "rw_mbytes_per_sec": 0, 00:05:39.352 "r_mbytes_per_sec": 0, 00:05:39.352 "w_mbytes_per_sec": 0 00:05:39.352 }, 00:05:39.352 "claimed": false, 00:05:39.352 "zoned": false, 00:05:39.352 "supported_io_types": { 00:05:39.352 "read": true, 00:05:39.352 "write": true, 00:05:39.352 "unmap": true, 00:05:39.352 "write_zeroes": true, 00:05:39.352 "flush": true, 00:05:39.352 "reset": true, 00:05:39.352 "compare": false, 00:05:39.352 "compare_and_write": false, 00:05:39.352 "abort": true, 00:05:39.352 "nvme_admin": false, 00:05:39.352 "nvme_io": false 00:05:39.352 }, 00:05:39.352 "memory_domains": [ 00:05:39.352 { 00:05:39.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:39.352 "dma_device_type": 2 00:05:39.352 } 00:05:39.352 ], 00:05:39.352 "driver_specific": { 00:05:39.352 "passthru": { 00:05:39.352 "name": "Passthru0", 00:05:39.352 "base_bdev_name": "Malloc2" 00:05:39.352 } 00:05:39.352 } 00:05:39.352 } 00:05:39.352 ]' 00:05:39.352 16:41:24 -- rpc/rpc.sh@21 -- # jq length 00:05:39.352 16:41:24 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:39.352 16:41:24 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:39.352 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.352 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.352 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.352 16:41:24 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:39.352 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.352 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.352 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.352 16:41:24 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:39.352 16:41:24 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:39.352 16:41:24 -- common/autotest_common.sh@10 -- # set +x 00:05:39.352 16:41:24 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:39.352 16:41:24 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:39.352 16:41:24 -- rpc/rpc.sh@26 -- # jq length 00:05:39.352 16:41:25 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:39.352 00:05:39.352 real 0m0.283s 00:05:39.352 user 0m0.194s 00:05:39.352 sys 0m0.030s 00:05:39.352 
16:41:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.352 16:41:25 -- common/autotest_common.sh@10 -- # set +x 00:05:39.352 ************************************ 00:05:39.352 END TEST rpc_daemon_integrity 00:05:39.352 ************************************ 00:05:39.352 16:41:25 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:39.352 16:41:25 -- rpc/rpc.sh@84 -- # killprocess 459985 00:05:39.352 16:41:25 -- common/autotest_common.sh@936 -- # '[' -z 459985 ']' 00:05:39.352 16:41:25 -- common/autotest_common.sh@940 -- # kill -0 459985 00:05:39.352 16:41:25 -- common/autotest_common.sh@941 -- # uname 00:05:39.352 16:41:25 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:39.352 16:41:25 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 459985 00:05:39.611 16:41:25 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:39.611 16:41:25 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:39.611 16:41:25 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 459985' 00:05:39.611 killing process with pid 459985 00:05:39.611 16:41:25 -- common/autotest_common.sh@955 -- # kill 459985 00:05:39.611 16:41:25 -- common/autotest_common.sh@960 -- # wait 459985 00:05:39.871 00:05:39.871 real 0m2.522s 00:05:39.871 user 0m3.157s 00:05:39.871 sys 0m0.765s 00:05:39.871 16:41:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:39.871 16:41:25 -- common/autotest_common.sh@10 -- # set +x 00:05:39.871 ************************************ 00:05:39.871 END TEST rpc 00:05:39.871 ************************************ 00:05:39.871 16:41:25 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:39.871 16:41:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:39.871 16:41:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:39.871 16:41:25 -- common/autotest_common.sh@10 -- # set +x 00:05:39.871 ************************************ 00:05:39.871 START TEST rpc_client 00:05:39.871 ************************************ 00:05:39.871 16:41:25 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:39.871 * Looking for test storage... 
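The teardown above runs killprocess: confirm the pid is still alive, read its command name with ps, refuse to kill a sudo wrapper, then signal and reap. A condensed sketch of that flow (the real helper in autotest_common.sh handles more cases than this):

killprocess() {
    local pid=$1
    kill -0 "$pid" || return 1                  # nothing to do if gone
    local name
    name=$(ps --no-headers -o comm= "$pid")
    [ "$name" = sudo ] && return 1              # never kill the sudo parent
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"
}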
00:05:39.871 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:39.871 16:41:25 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:39.871 16:41:25 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:39.871 16:41:25 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:40.131 16:41:25 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:40.131 16:41:25 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:40.131 16:41:25 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:40.131 16:41:25 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:40.131 16:41:25 -- scripts/common.sh@335 -- # IFS=.-: 00:05:40.131 16:41:25 -- scripts/common.sh@335 -- # read -ra ver1 00:05:40.131 16:41:25 -- scripts/common.sh@336 -- # IFS=.-: 00:05:40.131 16:41:25 -- scripts/common.sh@336 -- # read -ra ver2 00:05:40.131 16:41:25 -- scripts/common.sh@337 -- # local 'op=<' 00:05:40.131 16:41:25 -- scripts/common.sh@339 -- # ver1_l=2 00:05:40.131 16:41:25 -- scripts/common.sh@340 -- # ver2_l=1 00:05:40.131 16:41:25 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:40.131 16:41:25 -- scripts/common.sh@343 -- # case "$op" in 00:05:40.131 16:41:25 -- scripts/common.sh@344 -- # : 1 00:05:40.131 16:41:25 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:40.131 16:41:25 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:40.131 16:41:25 -- scripts/common.sh@364 -- # decimal 1 00:05:40.131 16:41:25 -- scripts/common.sh@352 -- # local d=1 00:05:40.131 16:41:25 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:40.131 16:41:25 -- scripts/common.sh@354 -- # echo 1 00:05:40.131 16:41:25 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:40.131 16:41:25 -- scripts/common.sh@365 -- # decimal 2 00:05:40.131 16:41:25 -- scripts/common.sh@352 -- # local d=2 00:05:40.131 16:41:25 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:40.131 16:41:25 -- scripts/common.sh@354 -- # echo 2 00:05:40.131 16:41:25 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:40.131 16:41:25 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:40.131 16:41:25 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:40.131 16:41:25 -- scripts/common.sh@367 -- # return 0 00:05:40.131 16:41:25 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:40.131 16:41:25 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:40.131 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.131 --rc genhtml_branch_coverage=1 00:05:40.131 --rc genhtml_function_coverage=1 00:05:40.131 --rc genhtml_legend=1 00:05:40.131 --rc geninfo_all_blocks=1 00:05:40.131 --rc geninfo_unexecuted_blocks=1 00:05:40.131 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.131 ' 00:05:40.131 16:41:25 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:40.131 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.131 --rc genhtml_branch_coverage=1 00:05:40.131 --rc genhtml_function_coverage=1 00:05:40.131 --rc genhtml_legend=1 00:05:40.131 --rc geninfo_all_blocks=1 00:05:40.131 --rc geninfo_unexecuted_blocks=1 00:05:40.131 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.131 ' 00:05:40.131 16:41:25 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:40.131 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.131 --rc genhtml_branch_coverage=1 
00:05:40.131 --rc genhtml_function_coverage=1 00:05:40.131 --rc genhtml_legend=1 00:05:40.131 --rc geninfo_all_blocks=1 00:05:40.131 --rc geninfo_unexecuted_blocks=1 00:05:40.131 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.131 ' 00:05:40.131 16:41:25 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:40.131 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.131 --rc genhtml_branch_coverage=1 00:05:40.131 --rc genhtml_function_coverage=1 00:05:40.131 --rc genhtml_legend=1 00:05:40.131 --rc geninfo_all_blocks=1 00:05:40.131 --rc geninfo_unexecuted_blocks=1 00:05:40.131 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.131 ' 00:05:40.131 16:41:25 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:40.131 OK 00:05:40.131 16:41:25 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:40.131 00:05:40.131 real 0m0.213s 00:05:40.131 user 0m0.111s 00:05:40.131 sys 0m0.121s 00:05:40.131 16:41:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:40.131 16:41:25 -- common/autotest_common.sh@10 -- # set +x 00:05:40.131 ************************************ 00:05:40.131 END TEST rpc_client 00:05:40.131 ************************************ 00:05:40.131 16:41:25 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:40.131 16:41:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:40.131 16:41:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:40.131 16:41:25 -- common/autotest_common.sh@10 -- # set +x 00:05:40.131 ************************************ 00:05:40.131 START TEST json_config 00:05:40.131 ************************************ 00:05:40.131 16:41:25 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:40.131 16:41:25 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:40.131 16:41:25 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:40.131 16:41:25 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:40.131 16:41:25 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:40.131 16:41:25 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:40.131 16:41:25 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:40.131 16:41:25 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:40.131 16:41:25 -- scripts/common.sh@335 -- # IFS=.-: 00:05:40.131 16:41:25 -- scripts/common.sh@335 -- # read -ra ver1 00:05:40.131 16:41:25 -- scripts/common.sh@336 -- # IFS=.-: 00:05:40.131 16:41:25 -- scripts/common.sh@336 -- # read -ra ver2 00:05:40.131 16:41:25 -- scripts/common.sh@337 -- # local 'op=<' 00:05:40.131 16:41:25 -- scripts/common.sh@339 -- # ver1_l=2 00:05:40.131 16:41:25 -- scripts/common.sh@340 -- # ver2_l=1 00:05:40.131 16:41:25 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:40.131 16:41:25 -- scripts/common.sh@343 -- # case "$op" in 00:05:40.131 16:41:25 -- scripts/common.sh@344 -- # : 1 00:05:40.392 16:41:25 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:40.392 16:41:25 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:40.392 16:41:25 -- scripts/common.sh@364 -- # decimal 1 00:05:40.392 16:41:25 -- scripts/common.sh@352 -- # local d=1 00:05:40.392 16:41:25 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:40.392 16:41:25 -- scripts/common.sh@354 -- # echo 1 00:05:40.392 16:41:25 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:40.392 16:41:25 -- scripts/common.sh@365 -- # decimal 2 00:05:40.392 16:41:25 -- scripts/common.sh@352 -- # local d=2 00:05:40.392 16:41:25 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:40.392 16:41:25 -- scripts/common.sh@354 -- # echo 2 00:05:40.392 16:41:25 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:40.392 16:41:25 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:40.392 16:41:25 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:40.392 16:41:25 -- scripts/common.sh@367 -- # return 0 00:05:40.392 16:41:25 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:40.392 16:41:25 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:40.392 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.392 --rc genhtml_branch_coverage=1 00:05:40.392 --rc genhtml_function_coverage=1 00:05:40.392 --rc genhtml_legend=1 00:05:40.392 --rc geninfo_all_blocks=1 00:05:40.392 --rc geninfo_unexecuted_blocks=1 00:05:40.392 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.392 ' 00:05:40.392 16:41:25 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:40.392 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.392 --rc genhtml_branch_coverage=1 00:05:40.392 --rc genhtml_function_coverage=1 00:05:40.392 --rc genhtml_legend=1 00:05:40.392 --rc geninfo_all_blocks=1 00:05:40.392 --rc geninfo_unexecuted_blocks=1 00:05:40.392 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.392 ' 00:05:40.392 16:41:25 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:40.392 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.392 --rc genhtml_branch_coverage=1 00:05:40.392 --rc genhtml_function_coverage=1 00:05:40.392 --rc genhtml_legend=1 00:05:40.392 --rc geninfo_all_blocks=1 00:05:40.392 --rc geninfo_unexecuted_blocks=1 00:05:40.392 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.392 ' 00:05:40.392 16:41:25 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:40.392 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.392 --rc genhtml_branch_coverage=1 00:05:40.392 --rc genhtml_function_coverage=1 00:05:40.392 --rc genhtml_legend=1 00:05:40.392 --rc geninfo_all_blocks=1 00:05:40.392 --rc geninfo_unexecuted_blocks=1 00:05:40.392 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.392 ' 00:05:40.392 16:41:25 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:40.392 16:41:25 -- nvmf/common.sh@7 -- # uname -s 00:05:40.392 16:41:25 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:40.392 16:41:25 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:40.392 16:41:25 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:40.392 16:41:25 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:40.392 16:41:25 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:40.392 16:41:25 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:40.392 16:41:25 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:40.392 16:41:25 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:40.392 16:41:25 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:40.392 16:41:25 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:40.392 16:41:25 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:40.392 16:41:25 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:40.392 16:41:25 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:40.392 16:41:25 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:40.392 16:41:25 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:40.392 16:41:25 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:40.392 16:41:25 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:40.392 16:41:25 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:40.392 16:41:25 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:40.392 16:41:25 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:40.392 16:41:25 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:40.392 16:41:25 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:40.392 16:41:25 -- paths/export.sh@5 -- # export PATH 00:05:40.392 16:41:25 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:40.392 16:41:25 -- nvmf/common.sh@46 -- # : 0 00:05:40.392 16:41:25 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:40.392 16:41:25 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:40.392 16:41:25 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:40.392 16:41:25 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:40.392 16:41:25 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:40.392 16:41:25 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:40.392 16:41:25 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:40.392 
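nvmf/common.sh above derives the host identity from nvme-cli: gen-hostnqn prints an nqn.2014-08.org.nvmexpress:uuid:<uuid> string, and the NVME_HOSTID seen in the log is the trailing UUID. One plausible way to reproduce that pair (the exact extraction in common.sh may differ):

NVME_HOSTNQN=$(nvme gen-hostnqn)
NVME_HOSTID=${NVME_HOSTNQN##*:}      # strip everything through the last ':'
NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
echo "${NVME_HOST[@]}"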
16:41:25 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:40.392 16:41:25 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:40.392 16:41:25 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:40.392 16:41:25 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:40.392 16:41:25 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:40.392 16:41:25 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:40.392 WARNING: No tests are enabled so not running JSON configuration tests 00:05:40.392 16:41:25 -- json_config/json_config.sh@27 -- # exit 0 00:05:40.392 00:05:40.392 real 0m0.189s 00:05:40.392 user 0m0.115s 00:05:40.392 sys 0m0.083s 00:05:40.392 16:41:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:40.392 16:41:25 -- common/autotest_common.sh@10 -- # set +x 00:05:40.392 ************************************ 00:05:40.392 END TEST json_config 00:05:40.392 ************************************ 00:05:40.392 16:41:25 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:40.392 16:41:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:40.392 16:41:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:40.392 16:41:25 -- common/autotest_common.sh@10 -- # set +x 00:05:40.392 ************************************ 00:05:40.392 START TEST json_config_extra_key 00:05:40.392 ************************************ 00:05:40.392 16:41:25 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:40.392 16:41:26 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:40.392 16:41:26 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:40.392 16:41:26 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:40.392 16:41:26 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:40.392 16:41:26 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:40.392 16:41:26 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:40.392 16:41:26 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:40.392 16:41:26 -- scripts/common.sh@335 -- # IFS=.-: 00:05:40.392 16:41:26 -- scripts/common.sh@335 -- # read -ra ver1 00:05:40.392 16:41:26 -- scripts/common.sh@336 -- # IFS=.-: 00:05:40.392 16:41:26 -- scripts/common.sh@336 -- # read -ra ver2 00:05:40.392 16:41:26 -- scripts/common.sh@337 -- # local 'op=<' 00:05:40.392 16:41:26 -- scripts/common.sh@339 -- # ver1_l=2 00:05:40.392 16:41:26 -- scripts/common.sh@340 -- # ver2_l=1 00:05:40.392 16:41:26 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:40.392 16:41:26 -- scripts/common.sh@343 -- # case "$op" in 00:05:40.392 16:41:26 -- scripts/common.sh@344 -- # : 1 00:05:40.392 16:41:26 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:40.392 16:41:26 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:40.392 16:41:26 -- scripts/common.sh@364 -- # decimal 1 00:05:40.392 16:41:26 -- scripts/common.sh@352 -- # local d=1 00:05:40.393 16:41:26 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:40.393 16:41:26 -- scripts/common.sh@354 -- # echo 1 00:05:40.393 16:41:26 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:40.393 16:41:26 -- scripts/common.sh@365 -- # decimal 2 00:05:40.393 16:41:26 -- scripts/common.sh@352 -- # local d=2 00:05:40.393 16:41:26 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:40.393 16:41:26 -- scripts/common.sh@354 -- # echo 2 00:05:40.393 16:41:26 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:40.393 16:41:26 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:40.393 16:41:26 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:40.393 16:41:26 -- scripts/common.sh@367 -- # return 0 00:05:40.393 16:41:26 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:40.393 16:41:26 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:40.393 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.393 --rc genhtml_branch_coverage=1 00:05:40.393 --rc genhtml_function_coverage=1 00:05:40.393 --rc genhtml_legend=1 00:05:40.393 --rc geninfo_all_blocks=1 00:05:40.393 --rc geninfo_unexecuted_blocks=1 00:05:40.393 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.393 ' 00:05:40.393 16:41:26 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:40.393 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.393 --rc genhtml_branch_coverage=1 00:05:40.393 --rc genhtml_function_coverage=1 00:05:40.393 --rc genhtml_legend=1 00:05:40.393 --rc geninfo_all_blocks=1 00:05:40.393 --rc geninfo_unexecuted_blocks=1 00:05:40.393 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.393 ' 00:05:40.393 16:41:26 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:40.393 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.393 --rc genhtml_branch_coverage=1 00:05:40.393 --rc genhtml_function_coverage=1 00:05:40.393 --rc genhtml_legend=1 00:05:40.393 --rc geninfo_all_blocks=1 00:05:40.393 --rc geninfo_unexecuted_blocks=1 00:05:40.393 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.393 ' 00:05:40.393 16:41:26 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:40.393 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:40.393 --rc genhtml_branch_coverage=1 00:05:40.393 --rc genhtml_function_coverage=1 00:05:40.393 --rc genhtml_legend=1 00:05:40.393 --rc geninfo_all_blocks=1 00:05:40.393 --rc geninfo_unexecuted_blocks=1 00:05:40.393 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:40.393 ' 00:05:40.393 16:41:26 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:40.393 16:41:26 -- nvmf/common.sh@7 -- # uname -s 00:05:40.393 16:41:26 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:40.393 16:41:26 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:40.393 16:41:26 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:40.393 16:41:26 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:40.653 16:41:26 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:40.653 16:41:26 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:40.653 16:41:26 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:40.653 16:41:26 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:40.653 16:41:26 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:40.653 16:41:26 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:40.653 16:41:26 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:40.653 16:41:26 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:40.653 16:41:26 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:40.653 16:41:26 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:40.653 16:41:26 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:40.653 16:41:26 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:40.653 16:41:26 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:40.653 16:41:26 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:40.653 16:41:26 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:40.653 16:41:26 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:40.653 16:41:26 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:40.653 16:41:26 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:40.653 16:41:26 -- paths/export.sh@5 -- # export PATH 00:05:40.653 16:41:26 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:40.653 16:41:26 -- nvmf/common.sh@46 -- # : 0 00:05:40.653 16:41:26 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:40.653 16:41:26 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:40.653 16:41:26 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:40.653 16:41:26 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:40.653 16:41:26 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:40.653 16:41:26 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:40.653 16:41:26 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:40.653 
16:41:26 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:40.653 INFO: launching applications... 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=460783 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:40.653 Waiting for target to run... 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 460783 /var/tmp/spdk_tgt.sock 00:05:40.653 16:41:26 -- common/autotest_common.sh@829 -- # '[' -z 460783 ']' 00:05:40.653 16:41:26 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:40.653 16:41:26 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:40.653 16:41:26 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:40.653 16:41:26 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:40.653 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:40.653 16:41:26 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:40.653 16:41:26 -- common/autotest_common.sh@10 -- # set +x 00:05:40.653 [2024-11-16 16:41:26.191292] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
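[Editor's note] The launch sequence traced above (json_config_test_start_app) reduces to the pattern sketched below. This is a simplified stand-in, not the helper itself: the binary path, flags, and socket name are taken from this run, while waitforlisten's real implementation in common/autotest_common.sh does more bookkeeping than this loop.

#!/usr/bin/env bash
# Hedged sketch: start spdk_tgt with an extra-key JSON config and block
# until its RPC UNIX socket accepts connections, as the trace above does.
SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt
CONFIG=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json
SOCK=/var/tmp/spdk_tgt.sock

"$SPDK_BIN" -m 0x1 -s 1024 -r "$SOCK" --json "$CONFIG" &
pid=$!

# Poll until the socket appears; bail out if the target dies first.
for i in $(seq 1 100); do
  [[ -S "$SOCK" ]] && break
  kill -0 "$pid" 2>/dev/null || { echo "target died" >&2; exit 1; }
  sleep 0.1
done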
00:05:40.653 [2024-11-16 16:41:26.191388] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid460783 ] 00:05:40.653 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.912 [2024-11-16 16:41:26.632225] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.912 [2024-11-16 16:41:26.659955] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:40.912 [2024-11-16 16:41:26.660064] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:41.479 16:41:27 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:41.479 16:41:27 -- common/autotest_common.sh@862 -- # return 0 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:41.479 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:41.479 INFO: shutting down applications... 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 460783 ]] 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 460783 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@50 -- # kill -0 460783 00:05:41.479 16:41:27 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:42.047 16:41:27 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:42.048 16:41:27 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:42.048 16:41:27 -- json_config/json_config_extra_key.sh@50 -- # kill -0 460783 00:05:42.048 16:41:27 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:42.048 16:41:27 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:42.048 16:41:27 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:42.048 16:41:27 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:42.048 SPDK target shutdown done 00:05:42.048 16:41:27 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:42.048 Success 00:05:42.048 00:05:42.048 real 0m1.564s 00:05:42.048 user 0m1.166s 00:05:42.048 sys 0m0.557s 00:05:42.048 16:41:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:42.048 16:41:27 -- common/autotest_common.sh@10 -- # set +x 00:05:42.048 ************************************ 00:05:42.048 END TEST json_config_extra_key 00:05:42.048 ************************************ 00:05:42.048 16:41:27 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:42.048 16:41:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:42.048 16:41:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:42.048 16:41:27 -- common/autotest_common.sh@10 -- # set +x 00:05:42.048 ************************************ 00:05:42.048 START TEST alias_rpc 00:05:42.048 ************************************ 00:05:42.048 16:41:27 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:42.048 * Looking for test storage... 00:05:42.048 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:42.048 16:41:27 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:42.048 16:41:27 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:42.048 16:41:27 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:42.048 16:41:27 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:42.048 16:41:27 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:42.048 16:41:27 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:42.048 16:41:27 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:42.048 16:41:27 -- scripts/common.sh@335 -- # IFS=.-: 00:05:42.048 16:41:27 -- scripts/common.sh@335 -- # read -ra ver1 00:05:42.048 16:41:27 -- scripts/common.sh@336 -- # IFS=.-: 00:05:42.048 16:41:27 -- scripts/common.sh@336 -- # read -ra ver2 00:05:42.048 16:41:27 -- scripts/common.sh@337 -- # local 'op=<' 00:05:42.048 16:41:27 -- scripts/common.sh@339 -- # ver1_l=2 00:05:42.048 16:41:27 -- scripts/common.sh@340 -- # ver2_l=1 00:05:42.048 16:41:27 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:42.048 16:41:27 -- scripts/common.sh@343 -- # case "$op" in 00:05:42.048 16:41:27 -- scripts/common.sh@344 -- # : 1 00:05:42.048 16:41:27 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:42.048 16:41:27 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:42.048 16:41:27 -- scripts/common.sh@364 -- # decimal 1 00:05:42.048 16:41:27 -- scripts/common.sh@352 -- # local d=1 00:05:42.048 16:41:27 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:42.048 16:41:27 -- scripts/common.sh@354 -- # echo 1 00:05:42.048 16:41:27 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:42.048 16:41:27 -- scripts/common.sh@365 -- # decimal 2 00:05:42.048 16:41:27 -- scripts/common.sh@352 -- # local d=2 00:05:42.048 16:41:27 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:42.048 16:41:27 -- scripts/common.sh@354 -- # echo 2 00:05:42.048 16:41:27 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:42.048 16:41:27 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:42.048 16:41:27 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:42.048 16:41:27 -- scripts/common.sh@367 -- # return 0 00:05:42.048 16:41:27 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:42.048 16:41:27 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:42.048 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.048 --rc genhtml_branch_coverage=1 00:05:42.048 --rc genhtml_function_coverage=1 00:05:42.048 --rc genhtml_legend=1 00:05:42.048 --rc geninfo_all_blocks=1 00:05:42.048 --rc geninfo_unexecuted_blocks=1 00:05:42.048 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.048 ' 00:05:42.048 16:41:27 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:42.048 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.048 --rc genhtml_branch_coverage=1 00:05:42.048 --rc genhtml_function_coverage=1 00:05:42.048 --rc genhtml_legend=1 00:05:42.048 --rc geninfo_all_blocks=1 00:05:42.048 --rc geninfo_unexecuted_blocks=1 00:05:42.048 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.048 ' 00:05:42.048 
16:41:27 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:42.048 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.048 --rc genhtml_branch_coverage=1 00:05:42.048 --rc genhtml_function_coverage=1 00:05:42.048 --rc genhtml_legend=1 00:05:42.048 --rc geninfo_all_blocks=1 00:05:42.048 --rc geninfo_unexecuted_blocks=1 00:05:42.048 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.048 ' 00:05:42.048 16:41:27 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:42.048 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:42.048 --rc genhtml_branch_coverage=1 00:05:42.048 --rc genhtml_function_coverage=1 00:05:42.048 --rc genhtml_legend=1 00:05:42.048 --rc geninfo_all_blocks=1 00:05:42.048 --rc geninfo_unexecuted_blocks=1 00:05:42.048 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:42.048 ' 00:05:42.048 16:41:27 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:42.048 16:41:27 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=461112 00:05:42.048 16:41:27 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 461112 00:05:42.048 16:41:27 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:42.048 16:41:27 -- common/autotest_common.sh@829 -- # '[' -z 461112 ']' 00:05:42.048 16:41:27 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:42.048 16:41:27 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:42.048 16:41:27 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:42.048 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:42.048 16:41:27 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:42.048 16:41:27 -- common/autotest_common.sh@10 -- # set +x 00:05:42.308 [2024-11-16 16:41:27.807534] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:05:42.308 [2024-11-16 16:41:27.807599] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid461112 ] 00:05:42.308 EAL: No free 2048 kB hugepages reported on node 1 00:05:42.308 [2024-11-16 16:41:27.886840] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:42.308 [2024-11-16 16:41:27.922062] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:42.308 [2024-11-16 16:41:27.922191] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:43.245 16:41:28 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:43.245 16:41:28 -- common/autotest_common.sh@862 -- # return 0 00:05:43.245 16:41:28 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:43.246 16:41:28 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 461112 00:05:43.246 16:41:28 -- common/autotest_common.sh@936 -- # '[' -z 461112 ']' 00:05:43.246 16:41:28 -- common/autotest_common.sh@940 -- # kill -0 461112 00:05:43.246 16:41:28 -- common/autotest_common.sh@941 -- # uname 00:05:43.246 16:41:28 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:43.246 16:41:28 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 461112 00:05:43.246 16:41:28 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:43.246 16:41:28 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:43.246 16:41:28 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 461112' 00:05:43.246 killing process with pid 461112 00:05:43.246 16:41:28 -- common/autotest_common.sh@955 -- # kill 461112 00:05:43.246 16:41:28 -- common/autotest_common.sh@960 -- # wait 461112 00:05:43.505 00:05:43.505 real 0m1.581s 00:05:43.505 user 0m1.672s 00:05:43.505 sys 0m0.476s 00:05:43.505 16:41:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:43.505 16:41:29 -- common/autotest_common.sh@10 -- # set +x 00:05:43.505 ************************************ 00:05:43.505 END TEST alias_rpc 00:05:43.505 ************************************ 00:05:43.505 16:41:29 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:05:43.505 16:41:29 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:43.505 16:41:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:43.505 16:41:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:43.505 16:41:29 -- common/autotest_common.sh@10 -- # set +x 00:05:43.505 ************************************ 00:05:43.505 START TEST spdkcli_tcp 00:05:43.505 ************************************ 00:05:43.505 16:41:29 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:43.765 * Looking for test storage... 
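[Editor's note] The teardown traced in the alias_rpc run above follows the killprocess() helper from autotest_common.sh. A condensed sketch, assuming the target is a child of the calling shell; the sudo guard and the ps-based name lookup mirror the traced lines, while error handling is simplified.

killprocess() {
  local pid=$1
  kill -0 "$pid" || return 1                  # still running?
  local name
  name=$(ps --no-headers -o comm= "$pid")     # "reactor_0" in this run
  [[ "$name" != "sudo" ]] || return 1         # never signal a sudo wrapper
  echo "killing process with pid $pid"
  kill "$pid"
  wait "$pid"                                 # reap and propagate exit status
}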
00:05:43.765 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:43.765 16:41:29 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:43.765 16:41:29 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:43.765 16:41:29 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:43.765 16:41:29 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:43.765 16:41:29 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:43.765 16:41:29 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:43.765 16:41:29 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:43.765 16:41:29 -- scripts/common.sh@335 -- # IFS=.-: 00:05:43.765 16:41:29 -- scripts/common.sh@335 -- # read -ra ver1 00:05:43.765 16:41:29 -- scripts/common.sh@336 -- # IFS=.-: 00:05:43.765 16:41:29 -- scripts/common.sh@336 -- # read -ra ver2 00:05:43.765 16:41:29 -- scripts/common.sh@337 -- # local 'op=<' 00:05:43.765 16:41:29 -- scripts/common.sh@339 -- # ver1_l=2 00:05:43.765 16:41:29 -- scripts/common.sh@340 -- # ver2_l=1 00:05:43.765 16:41:29 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:43.765 16:41:29 -- scripts/common.sh@343 -- # case "$op" in 00:05:43.765 16:41:29 -- scripts/common.sh@344 -- # : 1 00:05:43.765 16:41:29 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:43.765 16:41:29 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:43.765 16:41:29 -- scripts/common.sh@364 -- # decimal 1 00:05:43.765 16:41:29 -- scripts/common.sh@352 -- # local d=1 00:05:43.765 16:41:29 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:43.765 16:41:29 -- scripts/common.sh@354 -- # echo 1 00:05:43.765 16:41:29 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:43.765 16:41:29 -- scripts/common.sh@365 -- # decimal 2 00:05:43.765 16:41:29 -- scripts/common.sh@352 -- # local d=2 00:05:43.765 16:41:29 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:43.765 16:41:29 -- scripts/common.sh@354 -- # echo 2 00:05:43.765 16:41:29 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:43.765 16:41:29 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:43.765 16:41:29 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:43.765 16:41:29 -- scripts/common.sh@367 -- # return 0 00:05:43.765 16:41:29 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:43.765 16:41:29 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:43.765 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:43.765 --rc genhtml_branch_coverage=1 00:05:43.765 --rc genhtml_function_coverage=1 00:05:43.765 --rc genhtml_legend=1 00:05:43.765 --rc geninfo_all_blocks=1 00:05:43.765 --rc geninfo_unexecuted_blocks=1 00:05:43.765 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:43.765 ' 00:05:43.765 16:41:29 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:43.765 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:43.765 --rc genhtml_branch_coverage=1 00:05:43.765 --rc genhtml_function_coverage=1 00:05:43.765 --rc genhtml_legend=1 00:05:43.765 --rc geninfo_all_blocks=1 00:05:43.765 --rc geninfo_unexecuted_blocks=1 00:05:43.765 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:43.765 ' 00:05:43.765 16:41:29 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:43.765 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:43.765 --rc genhtml_branch_coverage=1 
00:05:43.765 --rc genhtml_function_coverage=1 00:05:43.765 --rc genhtml_legend=1 00:05:43.765 --rc geninfo_all_blocks=1 00:05:43.765 --rc geninfo_unexecuted_blocks=1 00:05:43.765 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:43.765 ' 00:05:43.765 16:41:29 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:43.765 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:43.765 --rc genhtml_branch_coverage=1 00:05:43.765 --rc genhtml_function_coverage=1 00:05:43.765 --rc genhtml_legend=1 00:05:43.765 --rc geninfo_all_blocks=1 00:05:43.765 --rc geninfo_unexecuted_blocks=1 00:05:43.765 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:43.765 ' 00:05:43.765 16:41:29 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:43.765 16:41:29 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:43.765 16:41:29 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:43.765 16:41:29 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:43.765 16:41:29 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:43.765 16:41:29 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:43.765 16:41:29 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:43.765 16:41:29 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:43.765 16:41:29 -- common/autotest_common.sh@10 -- # set +x 00:05:43.765 16:41:29 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=461449 00:05:43.765 16:41:29 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:43.765 16:41:29 -- spdkcli/tcp.sh@27 -- # waitforlisten 461449 00:05:43.765 16:41:29 -- common/autotest_common.sh@829 -- # '[' -z 461449 ']' 00:05:43.765 16:41:29 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:43.765 16:41:29 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:43.765 16:41:29 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:43.765 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:43.765 16:41:29 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:43.765 16:41:29 -- common/autotest_common.sh@10 -- # set +x 00:05:43.765 [2024-11-16 16:41:29.445483] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
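[Editor's note] The spdkcli_tcp target above is started with -m 0x3, i.e. bits 0 and 1 set, which is why two "Reactor started" notices follow (cores 0 and 1). A quick, illustrative way to decode such a mask:

printf 'cores:'; for b in $(seq 0 31); do (( (0x3 >> b) & 1 )) && printf ' %d' "$b"; done; echo
# prints: cores: 0 1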
00:05:43.765 [2024-11-16 16:41:29.445553] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid461449 ] 00:05:43.765 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.026 [2024-11-16 16:41:29.526405] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:44.026 [2024-11-16 16:41:29.562877] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:44.026 [2024-11-16 16:41:29.563102] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.026 [2024-11-16 16:41:29.563104] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:44.595 16:41:30 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:44.595 16:41:30 -- common/autotest_common.sh@862 -- # return 0 00:05:44.595 16:41:30 -- spdkcli/tcp.sh@31 -- # socat_pid=461707 00:05:44.595 16:41:30 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:44.595 16:41:30 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:44.856 [ 00:05:44.856 "spdk_get_version", 00:05:44.856 "rpc_get_methods", 00:05:44.856 "trace_get_info", 00:05:44.856 "trace_get_tpoint_group_mask", 00:05:44.856 "trace_disable_tpoint_group", 00:05:44.856 "trace_enable_tpoint_group", 00:05:44.856 "trace_clear_tpoint_mask", 00:05:44.856 "trace_set_tpoint_mask", 00:05:44.856 "vfu_tgt_set_base_path", 00:05:44.856 "framework_get_pci_devices", 00:05:44.856 "framework_get_config", 00:05:44.856 "framework_get_subsystems", 00:05:44.856 "iobuf_get_stats", 00:05:44.856 "iobuf_set_options", 00:05:44.856 "sock_set_default_impl", 00:05:44.856 "sock_impl_set_options", 00:05:44.856 "sock_impl_get_options", 00:05:44.856 "vmd_rescan", 00:05:44.856 "vmd_remove_device", 00:05:44.856 "vmd_enable", 00:05:44.856 "accel_get_stats", 00:05:44.856 "accel_set_options", 00:05:44.856 "accel_set_driver", 00:05:44.856 "accel_crypto_key_destroy", 00:05:44.856 "accel_crypto_keys_get", 00:05:44.856 "accel_crypto_key_create", 00:05:44.856 "accel_assign_opc", 00:05:44.856 "accel_get_module_info", 00:05:44.856 "accel_get_opc_assignments", 00:05:44.856 "notify_get_notifications", 00:05:44.856 "notify_get_types", 00:05:44.856 "bdev_get_histogram", 00:05:44.856 "bdev_enable_histogram", 00:05:44.856 "bdev_set_qos_limit", 00:05:44.856 "bdev_set_qd_sampling_period", 00:05:44.856 "bdev_get_bdevs", 00:05:44.856 "bdev_reset_iostat", 00:05:44.856 "bdev_get_iostat", 00:05:44.856 "bdev_examine", 00:05:44.856 "bdev_wait_for_examine", 00:05:44.856 "bdev_set_options", 00:05:44.856 "scsi_get_devices", 00:05:44.856 "thread_set_cpumask", 00:05:44.856 "framework_get_scheduler", 00:05:44.856 "framework_set_scheduler", 00:05:44.856 "framework_get_reactors", 00:05:44.856 "thread_get_io_channels", 00:05:44.856 "thread_get_pollers", 00:05:44.856 "thread_get_stats", 00:05:44.856 "framework_monitor_context_switch", 00:05:44.856 "spdk_kill_instance", 00:05:44.856 "log_enable_timestamps", 00:05:44.856 "log_get_flags", 00:05:44.856 "log_clear_flag", 00:05:44.856 "log_set_flag", 00:05:44.856 "log_get_level", 00:05:44.856 "log_set_level", 00:05:44.856 "log_get_print_level", 00:05:44.857 "log_set_print_level", 00:05:44.857 "framework_enable_cpumask_locks", 00:05:44.857 "framework_disable_cpumask_locks", 00:05:44.857 "framework_wait_init", 00:05:44.857 
"framework_start_init", 00:05:44.857 "virtio_blk_create_transport", 00:05:44.857 "virtio_blk_get_transports", 00:05:44.857 "vhost_controller_set_coalescing", 00:05:44.857 "vhost_get_controllers", 00:05:44.857 "vhost_delete_controller", 00:05:44.857 "vhost_create_blk_controller", 00:05:44.857 "vhost_scsi_controller_remove_target", 00:05:44.857 "vhost_scsi_controller_add_target", 00:05:44.857 "vhost_start_scsi_controller", 00:05:44.857 "vhost_create_scsi_controller", 00:05:44.857 "ublk_recover_disk", 00:05:44.857 "ublk_get_disks", 00:05:44.857 "ublk_stop_disk", 00:05:44.857 "ublk_start_disk", 00:05:44.857 "ublk_destroy_target", 00:05:44.857 "ublk_create_target", 00:05:44.857 "nbd_get_disks", 00:05:44.857 "nbd_stop_disk", 00:05:44.857 "nbd_start_disk", 00:05:44.857 "env_dpdk_get_mem_stats", 00:05:44.857 "nvmf_subsystem_get_listeners", 00:05:44.857 "nvmf_subsystem_get_qpairs", 00:05:44.857 "nvmf_subsystem_get_controllers", 00:05:44.857 "nvmf_get_stats", 00:05:44.857 "nvmf_get_transports", 00:05:44.857 "nvmf_create_transport", 00:05:44.857 "nvmf_get_targets", 00:05:44.857 "nvmf_delete_target", 00:05:44.857 "nvmf_create_target", 00:05:44.857 "nvmf_subsystem_allow_any_host", 00:05:44.857 "nvmf_subsystem_remove_host", 00:05:44.857 "nvmf_subsystem_add_host", 00:05:44.857 "nvmf_subsystem_remove_ns", 00:05:44.857 "nvmf_subsystem_add_ns", 00:05:44.857 "nvmf_subsystem_listener_set_ana_state", 00:05:44.857 "nvmf_discovery_get_referrals", 00:05:44.857 "nvmf_discovery_remove_referral", 00:05:44.857 "nvmf_discovery_add_referral", 00:05:44.857 "nvmf_subsystem_remove_listener", 00:05:44.857 "nvmf_subsystem_add_listener", 00:05:44.857 "nvmf_delete_subsystem", 00:05:44.857 "nvmf_create_subsystem", 00:05:44.857 "nvmf_get_subsystems", 00:05:44.857 "nvmf_set_crdt", 00:05:44.857 "nvmf_set_config", 00:05:44.857 "nvmf_set_max_subsystems", 00:05:44.857 "iscsi_set_options", 00:05:44.857 "iscsi_get_auth_groups", 00:05:44.857 "iscsi_auth_group_remove_secret", 00:05:44.857 "iscsi_auth_group_add_secret", 00:05:44.857 "iscsi_delete_auth_group", 00:05:44.857 "iscsi_create_auth_group", 00:05:44.857 "iscsi_set_discovery_auth", 00:05:44.857 "iscsi_get_options", 00:05:44.857 "iscsi_target_node_request_logout", 00:05:44.857 "iscsi_target_node_set_redirect", 00:05:44.857 "iscsi_target_node_set_auth", 00:05:44.857 "iscsi_target_node_add_lun", 00:05:44.857 "iscsi_get_connections", 00:05:44.857 "iscsi_portal_group_set_auth", 00:05:44.857 "iscsi_start_portal_group", 00:05:44.857 "iscsi_delete_portal_group", 00:05:44.857 "iscsi_create_portal_group", 00:05:44.857 "iscsi_get_portal_groups", 00:05:44.857 "iscsi_delete_target_node", 00:05:44.857 "iscsi_target_node_remove_pg_ig_maps", 00:05:44.857 "iscsi_target_node_add_pg_ig_maps", 00:05:44.857 "iscsi_create_target_node", 00:05:44.857 "iscsi_get_target_nodes", 00:05:44.857 "iscsi_delete_initiator_group", 00:05:44.857 "iscsi_initiator_group_remove_initiators", 00:05:44.857 "iscsi_initiator_group_add_initiators", 00:05:44.857 "iscsi_create_initiator_group", 00:05:44.857 "iscsi_get_initiator_groups", 00:05:44.857 "vfu_virtio_create_scsi_endpoint", 00:05:44.857 "vfu_virtio_scsi_remove_target", 00:05:44.857 "vfu_virtio_scsi_add_target", 00:05:44.857 "vfu_virtio_create_blk_endpoint", 00:05:44.857 "vfu_virtio_delete_endpoint", 00:05:44.857 "iaa_scan_accel_module", 00:05:44.857 "dsa_scan_accel_module", 00:05:44.857 "ioat_scan_accel_module", 00:05:44.857 "accel_error_inject_error", 00:05:44.857 "bdev_iscsi_delete", 00:05:44.857 "bdev_iscsi_create", 00:05:44.857 "bdev_iscsi_set_options", 
00:05:44.857 "bdev_virtio_attach_controller", 00:05:44.857 "bdev_virtio_scsi_get_devices", 00:05:44.857 "bdev_virtio_detach_controller", 00:05:44.857 "bdev_virtio_blk_set_hotplug", 00:05:44.857 "bdev_ftl_set_property", 00:05:44.857 "bdev_ftl_get_properties", 00:05:44.857 "bdev_ftl_get_stats", 00:05:44.857 "bdev_ftl_unmap", 00:05:44.857 "bdev_ftl_unload", 00:05:44.857 "bdev_ftl_delete", 00:05:44.857 "bdev_ftl_load", 00:05:44.857 "bdev_ftl_create", 00:05:44.857 "bdev_aio_delete", 00:05:44.857 "bdev_aio_rescan", 00:05:44.857 "bdev_aio_create", 00:05:44.857 "blobfs_create", 00:05:44.857 "blobfs_detect", 00:05:44.857 "blobfs_set_cache_size", 00:05:44.857 "bdev_zone_block_delete", 00:05:44.857 "bdev_zone_block_create", 00:05:44.857 "bdev_delay_delete", 00:05:44.857 "bdev_delay_create", 00:05:44.857 "bdev_delay_update_latency", 00:05:44.857 "bdev_split_delete", 00:05:44.857 "bdev_split_create", 00:05:44.857 "bdev_error_inject_error", 00:05:44.857 "bdev_error_delete", 00:05:44.857 "bdev_error_create", 00:05:44.857 "bdev_raid_set_options", 00:05:44.857 "bdev_raid_remove_base_bdev", 00:05:44.857 "bdev_raid_add_base_bdev", 00:05:44.857 "bdev_raid_delete", 00:05:44.857 "bdev_raid_create", 00:05:44.857 "bdev_raid_get_bdevs", 00:05:44.857 "bdev_lvol_grow_lvstore", 00:05:44.857 "bdev_lvol_get_lvols", 00:05:44.857 "bdev_lvol_get_lvstores", 00:05:44.857 "bdev_lvol_delete", 00:05:44.857 "bdev_lvol_set_read_only", 00:05:44.857 "bdev_lvol_resize", 00:05:44.857 "bdev_lvol_decouple_parent", 00:05:44.857 "bdev_lvol_inflate", 00:05:44.857 "bdev_lvol_rename", 00:05:44.857 "bdev_lvol_clone_bdev", 00:05:44.857 "bdev_lvol_clone", 00:05:44.857 "bdev_lvol_snapshot", 00:05:44.857 "bdev_lvol_create", 00:05:44.857 "bdev_lvol_delete_lvstore", 00:05:44.857 "bdev_lvol_rename_lvstore", 00:05:44.857 "bdev_lvol_create_lvstore", 00:05:44.857 "bdev_passthru_delete", 00:05:44.857 "bdev_passthru_create", 00:05:44.857 "bdev_nvme_cuse_unregister", 00:05:44.857 "bdev_nvme_cuse_register", 00:05:44.857 "bdev_opal_new_user", 00:05:44.857 "bdev_opal_set_lock_state", 00:05:44.857 "bdev_opal_delete", 00:05:44.857 "bdev_opal_get_info", 00:05:44.857 "bdev_opal_create", 00:05:44.857 "bdev_nvme_opal_revert", 00:05:44.857 "bdev_nvme_opal_init", 00:05:44.857 "bdev_nvme_send_cmd", 00:05:44.857 "bdev_nvme_get_path_iostat", 00:05:44.857 "bdev_nvme_get_mdns_discovery_info", 00:05:44.857 "bdev_nvme_stop_mdns_discovery", 00:05:44.857 "bdev_nvme_start_mdns_discovery", 00:05:44.857 "bdev_nvme_set_multipath_policy", 00:05:44.857 "bdev_nvme_set_preferred_path", 00:05:44.857 "bdev_nvme_get_io_paths", 00:05:44.857 "bdev_nvme_remove_error_injection", 00:05:44.857 "bdev_nvme_add_error_injection", 00:05:44.857 "bdev_nvme_get_discovery_info", 00:05:44.857 "bdev_nvme_stop_discovery", 00:05:44.857 "bdev_nvme_start_discovery", 00:05:44.857 "bdev_nvme_get_controller_health_info", 00:05:44.857 "bdev_nvme_disable_controller", 00:05:44.857 "bdev_nvme_enable_controller", 00:05:44.857 "bdev_nvme_reset_controller", 00:05:44.857 "bdev_nvme_get_transport_statistics", 00:05:44.857 "bdev_nvme_apply_firmware", 00:05:44.857 "bdev_nvme_detach_controller", 00:05:44.857 "bdev_nvme_get_controllers", 00:05:44.857 "bdev_nvme_attach_controller", 00:05:44.857 "bdev_nvme_set_hotplug", 00:05:44.857 "bdev_nvme_set_options", 00:05:44.857 "bdev_null_resize", 00:05:44.857 "bdev_null_delete", 00:05:44.857 "bdev_null_create", 00:05:44.857 "bdev_malloc_delete", 00:05:44.857 "bdev_malloc_create" 00:05:44.857 ] 00:05:44.857 16:41:30 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:44.857 16:41:30 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:44.857 16:41:30 -- common/autotest_common.sh@10 -- # set +x 00:05:44.857 16:41:30 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:44.857 16:41:30 -- spdkcli/tcp.sh@38 -- # killprocess 461449 00:05:44.857 16:41:30 -- common/autotest_common.sh@936 -- # '[' -z 461449 ']' 00:05:44.857 16:41:30 -- common/autotest_common.sh@940 -- # kill -0 461449 00:05:44.857 16:41:30 -- common/autotest_common.sh@941 -- # uname 00:05:44.857 16:41:30 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:44.857 16:41:30 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 461449 00:05:44.857 16:41:30 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:44.857 16:41:30 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:44.857 16:41:30 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 461449' 00:05:44.857 killing process with pid 461449 00:05:44.857 16:41:30 -- common/autotest_common.sh@955 -- # kill 461449 00:05:44.858 16:41:30 -- common/autotest_common.sh@960 -- # wait 461449 00:05:45.118 00:05:45.118 real 0m1.631s 00:05:45.118 user 0m2.975s 00:05:45.118 sys 0m0.517s 00:05:45.118 16:41:30 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:45.118 16:41:30 -- common/autotest_common.sh@10 -- # set +x 00:05:45.118 ************************************ 00:05:45.118 END TEST spdkcli_tcp 00:05:45.118 ************************************ 00:05:45.379 16:41:30 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:45.379 16:41:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:45.379 16:41:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:45.379 16:41:30 -- common/autotest_common.sh@10 -- # set +x 00:05:45.379 ************************************ 00:05:45.379 START TEST dpdk_mem_utility 00:05:45.379 ************************************ 00:05:45.379 16:41:30 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:45.379 * Looking for test storage... 
00:05:45.379 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:45.379 16:41:31 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:45.379 16:41:31 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:45.379 16:41:31 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:45.379 16:41:31 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:45.379 16:41:31 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:45.379 16:41:31 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:45.379 16:41:31 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:45.379 16:41:31 -- scripts/common.sh@335 -- # IFS=.-: 00:05:45.379 16:41:31 -- scripts/common.sh@335 -- # read -ra ver1 00:05:45.379 16:41:31 -- scripts/common.sh@336 -- # IFS=.-: 00:05:45.379 16:41:31 -- scripts/common.sh@336 -- # read -ra ver2 00:05:45.379 16:41:31 -- scripts/common.sh@337 -- # local 'op=<' 00:05:45.379 16:41:31 -- scripts/common.sh@339 -- # ver1_l=2 00:05:45.379 16:41:31 -- scripts/common.sh@340 -- # ver2_l=1 00:05:45.379 16:41:31 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:45.379 16:41:31 -- scripts/common.sh@343 -- # case "$op" in 00:05:45.379 16:41:31 -- scripts/common.sh@344 -- # : 1 00:05:45.379 16:41:31 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:45.379 16:41:31 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:45.379 16:41:31 -- scripts/common.sh@364 -- # decimal 1 00:05:45.379 16:41:31 -- scripts/common.sh@352 -- # local d=1 00:05:45.379 16:41:31 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:45.379 16:41:31 -- scripts/common.sh@354 -- # echo 1 00:05:45.379 16:41:31 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:45.379 16:41:31 -- scripts/common.sh@365 -- # decimal 2 00:05:45.379 16:41:31 -- scripts/common.sh@352 -- # local d=2 00:05:45.379 16:41:31 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:45.379 16:41:31 -- scripts/common.sh@354 -- # echo 2 00:05:45.379 16:41:31 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:45.379 16:41:31 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:45.379 16:41:31 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:45.379 16:41:31 -- scripts/common.sh@367 -- # return 0 00:05:45.379 16:41:31 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:45.379 16:41:31 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:45.379 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.379 --rc genhtml_branch_coverage=1 00:05:45.379 --rc genhtml_function_coverage=1 00:05:45.379 --rc genhtml_legend=1 00:05:45.379 --rc geninfo_all_blocks=1 00:05:45.379 --rc geninfo_unexecuted_blocks=1 00:05:45.379 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.379 ' 00:05:45.379 16:41:31 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:45.379 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.379 --rc genhtml_branch_coverage=1 00:05:45.379 --rc genhtml_function_coverage=1 00:05:45.379 --rc genhtml_legend=1 00:05:45.379 --rc geninfo_all_blocks=1 00:05:45.379 --rc geninfo_unexecuted_blocks=1 00:05:45.379 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.379 ' 00:05:45.379 16:41:31 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:45.379 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.379 --rc 
genhtml_branch_coverage=1 00:05:45.379 --rc genhtml_function_coverage=1 00:05:45.379 --rc genhtml_legend=1 00:05:45.379 --rc geninfo_all_blocks=1 00:05:45.379 --rc geninfo_unexecuted_blocks=1 00:05:45.379 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.379 ' 00:05:45.379 16:41:31 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:45.379 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:45.379 --rc genhtml_branch_coverage=1 00:05:45.379 --rc genhtml_function_coverage=1 00:05:45.379 --rc genhtml_legend=1 00:05:45.379 --rc geninfo_all_blocks=1 00:05:45.379 --rc geninfo_unexecuted_blocks=1 00:05:45.379 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:45.379 ' 00:05:45.379 16:41:31 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:45.379 16:41:31 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=461788 00:05:45.379 16:41:31 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 461788 00:05:45.379 16:41:31 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:45.379 16:41:31 -- common/autotest_common.sh@829 -- # '[' -z 461788 ']' 00:05:45.379 16:41:31 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:45.379 16:41:31 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:45.379 16:41:31 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:45.379 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:45.379 16:41:31 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:45.379 16:41:31 -- common/autotest_common.sh@10 -- # set +x 00:05:45.379 [2024-11-16 16:41:31.116268] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
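[Editor's note] The dpdk_mem_utility test that follows drives SPDK's memory introspection in two steps: ask the running target to dump its DPDK memory state, then post-process the dump file. A sketch using this run's paths; -m 0 selects heap id 0, matching the detailed listing below.

RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
MEM=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py

"$RPC" env_dpdk_get_mem_stats   # writes /tmp/spdk_mem_dump.txt
"$MEM"                          # heap / mempool / memzone summary
"$MEM" -m 0                     # per-element detail for heap 0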
00:05:45.379 [2024-11-16 16:41:31.116342] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid461788 ] 00:05:45.640 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.640 [2024-11-16 16:41:31.195881] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.640 [2024-11-16 16:41:31.233135] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:45.640 [2024-11-16 16:41:31.233248] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:46.209 16:41:31 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:46.209 16:41:31 -- common/autotest_common.sh@862 -- # return 0 00:05:46.209 16:41:31 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:46.209 16:41:31 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:46.209 16:41:31 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:46.209 16:41:31 -- common/autotest_common.sh@10 -- # set +x 00:05:46.209 { 00:05:46.209 "filename": "/tmp/spdk_mem_dump.txt" 00:05:46.209 } 00:05:46.209 16:41:31 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:46.209 16:41:31 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:46.470 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:46.470 1 heaps totaling size 814.000000 MiB 00:05:46.470 size: 814.000000 MiB heap id: 0 00:05:46.470 end heaps---------- 00:05:46.470 8 mempools totaling size 598.116089 MiB 00:05:46.470 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:46.470 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:46.470 size: 84.521057 MiB name: bdev_io_461788 00:05:46.470 size: 51.011292 MiB name: evtpool_461788 00:05:46.470 size: 50.003479 MiB name: msgpool_461788 00:05:46.470 size: 21.763794 MiB name: PDU_Pool 00:05:46.470 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:46.470 size: 0.026123 MiB name: Session_Pool 00:05:46.470 end mempools------- 00:05:46.470 6 memzones totaling size 4.142822 MiB 00:05:46.470 size: 1.000366 MiB name: RG_ring_0_461788 00:05:46.470 size: 1.000366 MiB name: RG_ring_1_461788 00:05:46.470 size: 1.000366 MiB name: RG_ring_4_461788 00:05:46.470 size: 1.000366 MiB name: RG_ring_5_461788 00:05:46.470 size: 0.125366 MiB name: RG_ring_2_461788 00:05:46.470 size: 0.015991 MiB name: RG_ring_3_461788 00:05:46.470 end memzones------- 00:05:46.470 16:41:32 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:46.470 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:46.470 list of free elements. 
size: 12.519348 MiB 00:05:46.470 element at address: 0x200000400000 with size: 1.999512 MiB 00:05:46.470 element at address: 0x200018e00000 with size: 0.999878 MiB 00:05:46.470 element at address: 0x200019000000 with size: 0.999878 MiB 00:05:46.470 element at address: 0x200003e00000 with size: 0.996277 MiB 00:05:46.470 element at address: 0x200031c00000 with size: 0.994446 MiB 00:05:46.470 element at address: 0x200013800000 with size: 0.978699 MiB 00:05:46.470 element at address: 0x200007000000 with size: 0.959839 MiB 00:05:46.470 element at address: 0x200019200000 with size: 0.936584 MiB 00:05:46.470 element at address: 0x200000200000 with size: 0.841614 MiB 00:05:46.470 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:05:46.470 element at address: 0x20000b200000 with size: 0.490723 MiB 00:05:46.470 element at address: 0x200000800000 with size: 0.487793 MiB 00:05:46.470 element at address: 0x200019400000 with size: 0.485657 MiB 00:05:46.470 element at address: 0x200027e00000 with size: 0.410034 MiB 00:05:46.470 element at address: 0x200003a00000 with size: 0.355530 MiB 00:05:46.470 list of standard malloc elements. size: 199.218079 MiB 00:05:46.470 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:05:46.470 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:05:46.470 element at address: 0x200018efff80 with size: 1.000122 MiB 00:05:46.470 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:05:46.470 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:05:46.470 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:05:46.470 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:05:46.470 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:05:46.470 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:05:46.470 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:05:46.470 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:05:46.470 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200003adb300 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200003adb500 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200003affa80 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200003affb40 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:05:46.470 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:05:46.470 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:05:46.470 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:05:46.470 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:05:46.470 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:05:46.470 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:05:46.470 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200027e69040 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:05:46.470 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:05:46.470 list of memzone associated elements. size: 602.262573 MiB 00:05:46.470 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:05:46.470 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:05:46.470 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:05:46.470 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:05:46.470 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:05:46.470 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_461788_0 00:05:46.470 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:05:46.470 associated memzone info: size: 48.002930 MiB name: MP_evtpool_461788_0 00:05:46.470 element at address: 0x200003fff380 with size: 48.003052 MiB 00:05:46.470 associated memzone info: size: 48.002930 MiB name: MP_msgpool_461788_0 00:05:46.470 element at address: 0x2000195be940 with size: 20.255554 MiB 00:05:46.470 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:05:46.470 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:05:46.470 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:05:46.470 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:05:46.470 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_461788 00:05:46.470 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:05:46.470 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_461788 00:05:46.470 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:05:46.470 associated memzone info: size: 1.007996 MiB name: MP_evtpool_461788 00:05:46.470 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:05:46.470 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:05:46.470 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:05:46.470 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:05:46.470 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:05:46.470 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:05:46.470 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:05:46.470 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:05:46.470 element at address: 0x200003eff180 with size: 1.000488 MiB 00:05:46.470 associated memzone info: size: 1.000366 MiB name: RG_ring_0_461788 00:05:46.471 element at address: 0x200003affc00 with size: 1.000488 MiB 00:05:46.471 associated memzone info: size: 1.000366 MiB name: RG_ring_1_461788 00:05:46.471 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:05:46.471 associated memzone info: size: 1.000366 MiB name: RG_ring_4_461788 00:05:46.471 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:05:46.471 associated memzone info: size: 1.000366 MiB name: RG_ring_5_461788 00:05:46.471 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:05:46.471 associated memzone 
info: size: 0.500366 MiB name: RG_MP_bdev_io_461788
00:05:46.471 element at address: 0x20000b27db80 with size: 0.500488 MiB
00:05:46.471 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:05:46.471 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:05:46.471 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:05:46.471 element at address: 0x20001947c540 with size: 0.250488 MiB
00:05:46.471 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:05:46.471 element at address: 0x200003adf880 with size: 0.125488 MiB
00:05:46.471 associated memzone info: size: 0.125366 MiB name: RG_ring_2_461788
00:05:46.471 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:05:46.471 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:05:46.471 element at address: 0x200027e69100 with size: 0.023743 MiB
00:05:46.471 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:05:46.471 element at address: 0x200003adb5c0 with size: 0.016113 MiB
00:05:46.471 associated memzone info: size: 0.015991 MiB name: RG_ring_3_461788
00:05:46.471 element at address: 0x200027e6f240 with size: 0.002441 MiB
00:05:46.471 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:05:46.471 element at address: 0x2000002d7980 with size: 0.000305 MiB
00:05:46.471 associated memzone info: size: 0.000183 MiB name: MP_msgpool_461788
00:05:46.471 element at address: 0x200003adb3c0 with size: 0.000305 MiB
00:05:46.471 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_461788
00:05:46.471 element at address: 0x200027e6fd00 with size: 0.000305 MiB
00:05:46.471 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:05:46.471 16:41:32 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:05:46.471 16:41:32 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 461788
00:05:46.471 16:41:32 -- common/autotest_common.sh@936 -- # '[' -z 461788 ']'
00:05:46.471 16:41:32 -- common/autotest_common.sh@940 -- # kill -0 461788
00:05:46.471 16:41:32 -- common/autotest_common.sh@941 -- # uname
00:05:46.471 16:41:32 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:46.471 16:41:32 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 461788
00:05:46.471 16:41:32 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:05:46.471 16:41:32 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:05:46.471 16:41:32 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 461788'
00:05:46.471 killing process with pid 461788
00:05:46.471 16:41:32 -- common/autotest_common.sh@955 -- # kill 461788
00:05:46.471 16:41:32 -- common/autotest_common.sh@960 -- # wait 461788
00:05:46.731
00:05:46.731 real 0m1.510s
00:05:46.731 user 0m1.533s
00:05:46.731 sys 0m0.493s
00:05:46.731 16:41:32 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:46.731 16:41:32 -- common/autotest_common.sh@10 -- # set +x
00:05:46.731 ************************************
00:05:46.731 END TEST dpdk_mem_utility
00:05:46.731 ************************************
00:05:46.731 16:41:32 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh
00:05:46.731 16:41:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:46.731 16:41:32 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:46.731 16:41:32 -- common/autotest_common.sh@10 -- # set +x
00:05:46.731 ************************************
00:05:46.731 START TEST event
00:05:46.731 ************************************
00:05:46.731 16:41:32 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh
00:05:46.992 * Looking for test storage...
00:05:46.992 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event
00:05:46.992 16:41:32 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:46.992 16:41:32 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:46.992 16:41:32 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:46.992 16:41:32 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:46.992 16:41:32 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:46.992 16:41:32 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:46.992 16:41:32 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:46.992 16:41:32 -- scripts/common.sh@335 -- # IFS=.-:
00:05:46.992 16:41:32 -- scripts/common.sh@335 -- # read -ra ver1
00:05:46.992 16:41:32 -- scripts/common.sh@336 -- # IFS=.-:
00:05:46.992 16:41:32 -- scripts/common.sh@336 -- # read -ra ver2
00:05:46.992 16:41:32 -- scripts/common.sh@337 -- # local 'op=<'
00:05:46.992 16:41:32 -- scripts/common.sh@339 -- # ver1_l=2
00:05:46.992 16:41:32 -- scripts/common.sh@340 -- # ver2_l=1
00:05:46.992 16:41:32 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:46.992 16:41:32 -- scripts/common.sh@343 -- # case "$op" in
00:05:46.992 16:41:32 -- scripts/common.sh@344 -- # : 1
00:05:46.992 16:41:32 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:46.992 16:41:32 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:46.992 16:41:32 -- scripts/common.sh@364 -- # decimal 1
00:05:46.992 16:41:32 -- scripts/common.sh@352 -- # local d=1
00:05:46.992 16:41:32 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:46.992 16:41:32 -- scripts/common.sh@354 -- # echo 1
00:05:46.992 16:41:32 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:46.992 16:41:32 -- scripts/common.sh@365 -- # decimal 2
00:05:46.992 16:41:32 -- scripts/common.sh@352 -- # local d=2
00:05:46.992 16:41:32 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:46.992 16:41:32 -- scripts/common.sh@354 -- # echo 2
00:05:46.992 16:41:32 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:46.992 16:41:32 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:46.992 16:41:32 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:46.992 16:41:32 -- scripts/common.sh@367 -- # return 0
00:05:46.992 16:41:32 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:46.992 16:41:32 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:46.992 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:46.992 --rc genhtml_branch_coverage=1
00:05:46.992 --rc genhtml_function_coverage=1
00:05:46.992 --rc genhtml_legend=1
00:05:46.992 --rc geninfo_all_blocks=1
00:05:46.992 --rc geninfo_unexecuted_blocks=1
00:05:46.992 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:46.992 '
00:05:46.992 16:41:32 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:46.992 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:46.992 --rc genhtml_branch_coverage=1
00:05:46.992 --rc genhtml_function_coverage=1
00:05:46.992 --rc genhtml_legend=1
00:05:46.992 --rc geninfo_all_blocks=1
00:05:46.992 --rc geninfo_unexecuted_blocks=1
00:05:46.992 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:46.992 '
00:05:46.992 16:41:32 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:46.992 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:46.992 --rc genhtml_branch_coverage=1
00:05:46.992 --rc genhtml_function_coverage=1
00:05:46.992 --rc genhtml_legend=1
00:05:46.992 --rc geninfo_all_blocks=1
00:05:46.992 --rc geninfo_unexecuted_blocks=1
00:05:46.992 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:46.992 '
00:05:46.992 16:41:32 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:46.992 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:46.992 --rc genhtml_branch_coverage=1
00:05:46.992 --rc genhtml_function_coverage=1
00:05:46.992 --rc genhtml_legend=1
00:05:46.992 --rc geninfo_all_blocks=1
00:05:46.992 --rc geninfo_unexecuted_blocks=1
00:05:46.993 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:46.993 '
00:05:46.993 16:41:32 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh
00:05:46.993 16:41:32 -- bdev/nbd_common.sh@6 -- # set -e
00:05:46.993 16:41:32 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:05:46.993 16:41:32 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']'
00:05:46.993 16:41:32 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:46.993 16:41:32 -- common/autotest_common.sh@10 -- # set +x
00:05:46.993 ************************************
00:05:46.993 START TEST event_perf
00:05:46.993 ************************************
00:05:46.993 16:41:32 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:05:46.993 Running I/O for 1 seconds...[2024-11-16 16:41:32.677628] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... [2024-11-16 16:41:32.677732] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid462127 ]
00:05:47.253 EAL: No free 2048 kB hugepages reported on node 1
00:05:47.253 [2024-11-16 16:41:32.763572] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:05:47.253 [2024-11-16 16:41:32.801956] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:47.253 [2024-11-16 16:41:32.802070] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:05:47.253 [2024-11-16 16:41:32.802152] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:47.253 [2024-11-16 16:41:32.802154] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:05:48.192 Running I/O for 1 seconds...
00:05:48.192 lcore 0: 192842
00:05:48.192 lcore 1: 192842
00:05:48.192 lcore 2: 192840
00:05:48.192 lcore 3: 192840
00:05:48.192 done.
00:05:48.192
00:05:48.192 real 0m1.197s
00:05:48.192 user 0m4.090s
00:05:48.192 sys 0m0.103s
00:05:48.192 16:41:33 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:48.192 16:41:33 -- common/autotest_common.sh@10 -- # set +x
00:05:48.192 ************************************
00:05:48.192 END TEST event_perf
00:05:48.192 ************************************
00:05:48.192 16:41:33 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:05:48.192 16:41:33 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']'
00:05:48.192 16:41:33 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:48.192 16:41:33 -- common/autotest_common.sh@10 -- # set +x
00:05:48.192 ************************************
00:05:48.192 START TEST event_reactor
00:05:48.192 ************************************
00:05:48.192 16:41:33 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:05:48.192 [2024-11-16 16:41:33.914501] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... [2024-11-16 16:41:33.914553] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid462420 ]
00:05:48.467 EAL: No free 2048 kB hugepages reported on node 1
00:05:48.467 [2024-11-16 16:41:33.989064] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:48.467 [2024-11-16 16:41:34.023719] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:49.407 test_start
00:05:49.407 oneshot
00:05:49.407 tick 100
00:05:49.407 tick 100
00:05:49.407 tick 250
00:05:49.407 tick 100
00:05:49.407 tick 100
00:05:49.407 tick 100
00:05:49.407 tick 250
00:05:49.407 tick 500
00:05:49.408 tick 100
00:05:49.408 tick 100
00:05:49.408 tick 250
00:05:49.408 tick 100
00:05:49.408 tick 100
00:05:49.408 test_end
00:05:49.408
00:05:49.408 real 0m1.169s
00:05:49.408 user 0m1.076s
00:05:49.408 sys 0m0.089s
00:05:49.408 16:41:35 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:49.408 16:41:35 -- common/autotest_common.sh@10 -- # set +x
00:05:49.408 ************************************
00:05:49.408 END TEST event_reactor
00:05:49.408 ************************************
00:05:49.408 16:41:35 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:05:49.408 16:41:35 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']'
00:05:49.408 16:41:35 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:49.408 16:41:35 -- common/autotest_common.sh@10 -- # set +x
00:05:49.408 ************************************
00:05:49.408 START TEST event_reactor_perf
00:05:49.408 ************************************
00:05:49.408 16:41:35 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:05:49.408 [2024-11-16 16:41:35.142835] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:05:49.408 [2024-11-16 16:41:35.142922] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid462703 ]
00:05:49.668 EAL: No free 2048 kB hugepages reported on node 1
00:05:49.668 [2024-11-16 16:41:35.223246] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:49.668 [2024-11-16 16:41:35.258001] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:50.626 test_start
00:05:50.626 test_end
00:05:50.626 Performance: 971421 events per second
00:05:50.626
00:05:50.626 real 0m1.188s
00:05:50.626 user 0m1.085s
00:05:50.626 sys 0m0.099s
00:05:50.626 16:41:36 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:50.626 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:50.626 ************************************
00:05:50.626 END TEST event_reactor_perf
00:05:50.626 ************************************
00:05:50.626 16:41:36 -- event/event.sh@49 -- # uname -s
00:05:50.626 16:41:36 -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:05:50.626 16:41:36 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:05:50.626 16:41:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:50.626 16:41:36 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:50.626 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:50.626 ************************************
00:05:50.626 START TEST event_scheduler
00:05:50.626 ************************************
00:05:50.626 16:41:36 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:05:50.915 * Looking for test storage...
00:05:50.915 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler
00:05:50.915 16:41:36 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:50.915 16:41:36 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:50.915 16:41:36 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:50.915 16:41:36 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:50.915 16:41:36 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:50.915 16:41:36 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:50.915 16:41:36 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:50.915 16:41:36 -- scripts/common.sh@335 -- # IFS=.-:
00:05:50.915 16:41:36 -- scripts/common.sh@335 -- # read -ra ver1
00:05:50.915 16:41:36 -- scripts/common.sh@336 -- # IFS=.-:
00:05:50.915 16:41:36 -- scripts/common.sh@336 -- # read -ra ver2
00:05:50.915 16:41:36 -- scripts/common.sh@337 -- # local 'op=<'
00:05:50.915 16:41:36 -- scripts/common.sh@339 -- # ver1_l=2
00:05:50.915 16:41:36 -- scripts/common.sh@340 -- # ver2_l=1
00:05:50.915 16:41:36 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:50.915 16:41:36 -- scripts/common.sh@343 -- # case "$op" in
00:05:50.915 16:41:36 -- scripts/common.sh@344 -- # : 1
00:05:50.915 16:41:36 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:50.915 16:41:36 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:50.915 16:41:36 -- scripts/common.sh@364 -- # decimal 1
00:05:50.915 16:41:36 -- scripts/common.sh@352 -- # local d=1
00:05:50.915 16:41:36 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:50.915 16:41:36 -- scripts/common.sh@354 -- # echo 1
00:05:50.915 16:41:36 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:50.915 16:41:36 -- scripts/common.sh@365 -- # decimal 2
00:05:50.915 16:41:36 -- scripts/common.sh@352 -- # local d=2
00:05:50.915 16:41:36 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:50.915 16:41:36 -- scripts/common.sh@354 -- # echo 2
00:05:50.915 16:41:36 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:50.915 16:41:36 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:50.915 16:41:36 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:50.915 16:41:36 -- scripts/common.sh@367 -- # return 0
00:05:50.915 16:41:36 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:50.915 16:41:36 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:50.915 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:50.915 --rc genhtml_branch_coverage=1
00:05:50.915 --rc genhtml_function_coverage=1
00:05:50.915 --rc genhtml_legend=1
00:05:50.915 --rc geninfo_all_blocks=1
00:05:50.915 --rc geninfo_unexecuted_blocks=1
00:05:50.915 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:50.915 '
00:05:50.915 16:41:36 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:50.915 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:50.915 --rc genhtml_branch_coverage=1
00:05:50.915 --rc genhtml_function_coverage=1
00:05:50.915 --rc genhtml_legend=1
00:05:50.915 --rc geninfo_all_blocks=1
00:05:50.915 --rc geninfo_unexecuted_blocks=1
00:05:50.915 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:50.915 '
00:05:50.915 16:41:36 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:50.915 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:50.915 --rc genhtml_branch_coverage=1
00:05:50.915 --rc genhtml_function_coverage=1
00:05:50.915 --rc genhtml_legend=1
00:05:50.915 --rc geninfo_all_blocks=1
00:05:50.915 --rc geninfo_unexecuted_blocks=1
00:05:50.915 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:50.915 '
00:05:50.915 16:41:36 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:50.915 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:50.915 --rc genhtml_branch_coverage=1
00:05:50.915 --rc genhtml_function_coverage=1
00:05:50.915 --rc genhtml_legend=1
00:05:50.915 --rc geninfo_all_blocks=1
00:05:50.915 --rc geninfo_unexecuted_blocks=1
00:05:50.915 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:50.915 '
00:05:50.915 16:41:36 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:05:50.915 16:41:36 -- scheduler/scheduler.sh@35 -- # scheduler_pid=463021
00:05:50.916 16:41:36 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:05:50.916 16:41:36 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:05:50.916 16:41:36 -- scheduler/scheduler.sh@37 -- # waitforlisten 463021
00:05:50.916 16:41:36 -- common/autotest_common.sh@829 -- # '[' -z 463021 ']'
00:05:50.916 16:41:36 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:05:50.916 16:41:36 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:50.916 16:41:36 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:05:50.916 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:05:50.916 16:41:36 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:50.916 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:50.916 [2024-11-16 16:41:36.583708] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... [2024-11-16 16:41:36.583793] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid463021 ]
00:05:51.222 EAL: No free 2048 kB hugepages reported on node 1
00:05:51.222 [2024-11-16 16:41:36.666126] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:05:51.222 [2024-11-16 16:41:36.706118] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:51.222 [2024-11-16 16:41:36.706229] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:51.222 [2024-11-16 16:41:36.706316] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:05:51.222 [2024-11-16 16:41:36.706317] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:05:51.222 16:41:36 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:51.222 16:41:36 -- common/autotest_common.sh@862 -- # return 0
00:05:51.222 16:41:36 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:05:51.222 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.222 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.222 POWER: Env isn't set yet!
00:05:51.222 POWER: Attempting to initialise ACPI cpufreq power management...
00:05:51.222 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor
00:05:51.222 POWER: Cannot set governor of lcore 0 to userspace
00:05:51.222 POWER: Attempting to initialise PSTAT power management...
00:05:51.222 POWER: Power management governor of lcore 0 has been set to 'performance' successfully
00:05:51.222 POWER: Initialized successfully for lcore 0 power management
00:05:51.222 POWER: Power management governor of lcore 1 has been set to 'performance' successfully
00:05:51.222 POWER: Initialized successfully for lcore 1 power management
00:05:51.222 POWER: Power management governor of lcore 2 has been set to 'performance' successfully
00:05:51.222 POWER: Initialized successfully for lcore 2 power management
00:05:51.222 POWER: Power management governor of lcore 3 has been set to 'performance' successfully
00:05:51.222 POWER: Initialized successfully for lcore 3 power management
00:05:51.222 [2024-11-16 16:41:36.781862] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:05:51.222 [2024-11-16 16:41:36.781876] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:05:51.222 [2024-11-16 16:41:36.781885] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:05:51.222 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.222 16:41:36 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:05:51.222 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.222 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.222 [2024-11-16 16:41:36.843781] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:05:51.222 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.222 16:41:36 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:05:51.222 16:41:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:51.222 16:41:36 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:51.222 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.222 ************************************
00:05:51.222 START TEST scheduler_create_thread
00:05:51.222 ************************************
00:05:51.222 16:41:36 -- common/autotest_common.sh@1114 -- # scheduler_create_thread
00:05:51.222 16:41:36 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:05:51.222 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.222 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.222 2
00:05:51.222 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.222 16:41:36 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:05:51.222 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.222 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.222 3
00:05:51.222 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.222 16:41:36 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:05:51.222 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.222 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.222 4
00:05:51.222 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.222 16:41:36 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:05:51.222 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.223 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.223 5
00:05:51.223 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.223 16:41:36 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:05:51.223 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.223 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.223 6
00:05:51.223 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.223 16:41:36 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:05:51.223 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.223 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.223 7
00:05:51.223 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.223 16:41:36 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:05:51.223 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.223 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.223 8
00:05:51.223 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.223 16:41:36 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:05:51.223 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.223 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.223 9
00:05:51.223 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.223 16:41:36 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:05:51.223 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.223 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:51.223 10
00:05:51.223 16:41:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:51.223 16:41:36 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:05:51.223 16:41:36 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:51.223 16:41:36 -- common/autotest_common.sh@10 -- # set +x
00:05:52.674 16:41:38 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:52.674 16:41:38 -- scheduler/scheduler.sh@22 -- # thread_id=11
00:05:52.674 16:41:38 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:05:52.674 16:41:38 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:52.674 16:41:38 -- common/autotest_common.sh@10 -- # set +x
00:05:53.672 16:41:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:53.672 16:41:39 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:05:53.672 16:41:39 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:53.672 16:41:39 -- common/autotest_common.sh@10 -- # set +x
00:05:54.295 16:41:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:54.295 16:41:39 -- scheduler/scheduler.sh@25 -- # thread_id=12
00:05:54.295 16:41:39 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12
00:05:54.295 16:41:39 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:54.295 16:41:39 -- common/autotest_common.sh@10 -- # set +x
00:05:55.232 16:41:40 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:55.232
00:05:55.232 real 0m3.894s
00:05:55.232 user 0m0.023s
00:05:55.232 sys 0m0.008s
00:05:55.232 16:41:40 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:55.232 16:41:40 -- common/autotest_common.sh@10 -- # set +x
00:05:55.232 ************************************
00:05:55.232 END TEST scheduler_create_thread
00:05:55.232 ************************************
00:05:55.232 16:41:40 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT
00:05:55.232 16:41:40 -- scheduler/scheduler.sh@46 -- # killprocess 463021
00:05:55.232 16:41:40 -- common/autotest_common.sh@936 -- # '[' -z 463021 ']'
00:05:55.232 16:41:40 -- common/autotest_common.sh@940 -- # kill -0 463021
00:05:55.232 16:41:40 -- common/autotest_common.sh@941 -- # uname
00:05:55.232 16:41:40 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:55.232 16:41:40 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 463021
00:05:55.232 16:41:40 -- common/autotest_common.sh@942 -- # process_name=reactor_2
00:05:55.232 16:41:40 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']'
00:05:55.232 16:41:40 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 463021'
00:05:55.232 killing process with pid 463021
00:05:55.232 16:41:40 -- common/autotest_common.sh@955 -- # kill 463021
00:05:55.232 16:41:40 -- common/autotest_common.sh@960 -- # wait 463021
00:05:55.492 [2024-11-16 16:41:41.127284] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped.
00:05:55.751 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully
00:05:55.751 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original
00:05:55.751 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully
00:05:55.751 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original
00:05:55.751 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully
00:05:55.751 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original
00:05:55.751 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully
00:05:55.751 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original
00:05:55.751
00:05:55.751 real 0m5.011s
00:05:55.751 user 0m9.465s
00:05:55.751 sys 0m0.399s
00:05:55.751 16:41:41 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:55.751 16:41:41 -- common/autotest_common.sh@10 -- # set +x
00:05:55.751 ************************************
00:05:55.751 END TEST event_scheduler
00:05:55.751 ************************************
00:05:55.751 16:41:41 -- event/event.sh@51 -- # modprobe -n nbd
00:05:55.751 16:41:41 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test
00:05:55.751 16:41:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:55.751 16:41:41 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:55.751 16:41:41 -- common/autotest_common.sh@10 -- # set +x
00:05:55.751 ************************************
00:05:55.751 START TEST app_repeat
00:05:55.751 ************************************
00:05:55.751 16:41:41 -- common/autotest_common.sh@1114 -- # app_repeat_test
00:05:55.751 16:41:41 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:55.751 16:41:41 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:55.751 16:41:41 -- event/event.sh@13 -- # local nbd_list
00:05:55.751 16:41:41 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:55.751 16:41:41 -- event/event.sh@14 -- # local bdev_list
00:05:55.751 16:41:41 -- event/event.sh@15 -- # local repeat_times=4
00:05:55.751 16:41:41 -- event/event.sh@17 -- # modprobe nbd
00:05:55.751 16:41:41 -- event/event.sh@19 -- # repeat_pid=463902
00:05:55.751 16:41:41 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
00:05:55.751 16:41:41 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4
00:05:55.751 16:41:41 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 463902'
00:05:55.751 Process app_repeat pid: 463902
00:05:55.751 16:41:41 -- event/event.sh@23 -- # for i in {0..2}
00:05:55.751 16:41:41 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0'
00:05:55.751 spdk_app_start Round 0
00:05:55.751 16:41:41 -- event/event.sh@25 -- # waitforlisten 463902 /var/tmp/spdk-nbd.sock
00:05:55.751 16:41:41 -- common/autotest_common.sh@829 -- # '[' -z 463902 ']'
00:05:55.751 16:41:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:05:55.751 16:41:41 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:55.751 16:41:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:05:55.751 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:05:55.751 16:41:41 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:55.751 16:41:41 -- common/autotest_common.sh@10 -- # set +x
00:05:55.751 [2024-11-16 16:41:41.468555] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... [2024-11-16 16:41:41.468644] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid463902 ]
00:05:56.011 EAL: No free 2048 kB hugepages reported on node 1
00:05:56.011 [2024-11-16 16:41:41.536170] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:05:56.011 [2024-11-16 16:41:41.577693] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:56.011 [2024-11-16 16:41:41.577696] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:56.579 16:41:42 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:56.579 16:41:42 -- common/autotest_common.sh@862 -- # return 0
00:05:56.579 16:41:42 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:56.838 Malloc0
00:05:56.838 16:41:42 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:57.097 Malloc1
00:05:57.097 16:41:42 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@91 -- # local bdev_list
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@92 -- # local nbd_list
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@10 -- # local bdev_list
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@11 -- # local nbd_list
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@12 -- # local i
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:57.097 16:41:42 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:05:57.356 /dev/nbd0
00:05:57.356 16:41:42 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:05:57.356 16:41:42 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:05:57.356 16:41:42 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:05:57.356 16:41:42 -- common/autotest_common.sh@867 -- # local i
00:05:57.356 16:41:42 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:57.356 16:41:42 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:57.356 16:41:42 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:05:57.356 16:41:42 -- common/autotest_common.sh@871 -- # break
00:05:57.356 16:41:42 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:57.356 16:41:42 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:57.356 16:41:42 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:57.356 1+0 records in
00:05:57.356 1+0 records out
00:05:57.356 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254018 s, 16.1 MB/s
00:05:57.356 16:41:42 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:57.356 16:41:42 -- common/autotest_common.sh@884 -- # size=4096
00:05:57.356 16:41:42 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:57.356 16:41:42 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:57.356 16:41:42 -- common/autotest_common.sh@887 -- # return 0
00:05:57.356 16:41:42 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:57.356 16:41:42 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:57.356 16:41:42 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:05:57.356 /dev/nbd1
00:05:57.356 16:41:43 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:05:57.356 16:41:43 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:05:57.356 16:41:43 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:05:57.356 16:41:43 -- common/autotest_common.sh@867 -- # local i
00:05:57.356 16:41:43 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:57.616 16:41:43 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:57.616 16:41:43 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:05:57.616 16:41:43 -- common/autotest_common.sh@871 -- # break
00:05:57.616 16:41:43 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:57.616 16:41:43 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:57.616 16:41:43 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:57.616 1+0 records in
00:05:57.616 1+0 records out
00:05:57.616 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218904 s, 18.7 MB/s
00:05:57.616 16:41:43 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:57.616 16:41:43 -- common/autotest_common.sh@884 -- # size=4096
00:05:57.616 16:41:43 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:57.616 16:41:43 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:57.616 16:41:43 -- common/autotest_common.sh@887 -- # return 0
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:05:57.616 {
00:05:57.616 "nbd_device": "/dev/nbd0",
00:05:57.616 "bdev_name": "Malloc0"
00:05:57.616 },
00:05:57.616 {
00:05:57.616 "nbd_device": "/dev/nbd1",
00:05:57.616 "bdev_name": "Malloc1"
00:05:57.616 }
00:05:57.616 ]'
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@64 -- # echo '[
00:05:57.616 {
00:05:57.616 "nbd_device": "/dev/nbd0",
00:05:57.616 "bdev_name": "Malloc0"
00:05:57.616 },
00:05:57.616 {
00:05:57.616 "nbd_device": "/dev/nbd1",
00:05:57.616 "bdev_name": "Malloc1"
00:05:57.616 }
00:05:57.616 ]'
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:05:57.616 /dev/nbd1'
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:05:57.616 /dev/nbd1'
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@65 -- # count=2
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@66 -- # echo 2
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@95 -- # count=2
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@71 -- # local operation=write
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:05:57.616 16:41:43 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:05:57.616 256+0 records in
00:05:57.616 256+0 records out
00:05:57.616 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107994 s, 97.1 MB/s
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:05:57.876 256+0 records in
00:05:57.876 256+0 records out
00:05:57.876 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.02007 s, 52.2 MB/s
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:05:57.876 256+0 records in
00:05:57.876 256+0 records out
00:05:57.876 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.021123 s, 49.6 MB/s
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@71 -- # local operation=verify
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@50 -- # local nbd_list
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@51 -- # local i
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:05:57.876 16:41:43 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@41 -- # break
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@45 -- # return 0
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@41 -- # break
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@45 -- # return 0
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:58.135 16:41:43 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@64 -- # echo '[]'
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@65 -- # echo ''
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@65 -- # true
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@65 -- # count=0
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@66 -- # echo 0
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@104 -- # count=0
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:05:58.395 16:41:44 -- bdev/nbd_common.sh@109 -- # return 0
00:05:58.395 16:41:44 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:05:58.654 16:41:44 -- event/event.sh@35 -- # sleep 3
00:05:58.913 [2024-11-16 16:41:44.430222] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:05:58.913 [2024-11-16 16:41:44.462943] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:58.913 [2024-11-16 16:41:44.462944] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:58.913 [2024-11-16 16:41:44.502641] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:05:58.913 [2024-11-16 16:41:44.502694] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:06:02.202 16:41:47 -- event/event.sh@23 -- # for i in {0..2}
00:06:02.202 16:41:47 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1'
00:06:02.202 spdk_app_start Round 1
00:06:02.202 16:41:47 -- event/event.sh@25 -- # waitforlisten 463902 /var/tmp/spdk-nbd.sock
00:06:02.202 16:41:47 -- common/autotest_common.sh@829 -- # '[' -z 463902 ']'
00:06:02.202 16:41:47 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:06:02.203 16:41:47 -- common/autotest_common.sh@834 -- # local max_retries=100
00:06:02.203 16:41:47 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:06:02.203 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:06:02.203 16:41:47 -- common/autotest_common.sh@838 -- # xtrace_disable
00:06:02.203 16:41:47 -- common/autotest_common.sh@10 -- # set +x
00:06:02.203 16:41:47 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:06:02.203 16:41:47 -- common/autotest_common.sh@862 -- # return 0
00:06:02.203 16:41:47 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:06:02.203 Malloc0
00:06:02.203 16:41:47 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:06:02.203 Malloc1
00:06:02.203 16:41:47 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@91 -- # local bdev_list
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@92 -- # local nbd_list
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@10 -- # local bdev_list
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@11 -- # local nbd_list
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@12 -- # local i
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:06:02.203 16:41:47 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:06:02.463 /dev/nbd0
00:06:02.463 16:41:47 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:06:02.463 16:41:47 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:06:02.463 16:41:47 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:06:02.463 16:41:47 -- common/autotest_common.sh@867 -- # local i
00:06:02.463 16:41:47 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:06:02.463 16:41:47 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:06:02.463 16:41:47 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:06:02.463 16:41:47 -- common/autotest_common.sh@871 -- # break
00:06:02.463 16:41:47 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:06:02.463 16:41:47 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:06:02.463 16:41:47 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:06:02.463 1+0 records in
00:06:02.463 1+0 records out
00:06:02.463 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245346 s, 16.7 MB/s
00:06:02.463 16:41:47 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:06:02.463 16:41:48 -- common/autotest_common.sh@884 -- # size=4096
00:06:02.463 16:41:48 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:06:02.463 16:41:48 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:06:02.463 16:41:48 -- common/autotest_common.sh@887 -- # return 0
00:06:02.463 16:41:48 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:06:02.463 16:41:48 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:06:02.463 16:41:48 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:06:02.463 /dev/nbd1
00:06:02.463 16:41:48 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:06:02.463 16:41:48 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:06:02.463 16:41:48 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:06:02.463 16:41:48 -- common/autotest_common.sh@867 -- # local i
00:06:02.463 16:41:48 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:06:02.463 16:41:48 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:06:02.463 16:41:48 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:06:02.463 16:41:48 -- common/autotest_common.sh@871 -- # break
00:06:02.463 16:41:48 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:06:02.463 16:41:48 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:06:02.463 16:41:48 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:06:02.723 1+0 records in
00:06:02.723 1+0 records out
00:06:02.723 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245115 s, 16.7 MB/s
00:06:02.723 16:41:48 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:06:02.723 16:41:48 -- common/autotest_common.sh@884 -- # size=4096
00:06:02.723 16:41:48 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:06:02.723 16:41:48 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:06:02.723 16:41:48 -- common/autotest_common.sh@887 -- # return 0
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:06:02.724 {
00:06:02.724 "nbd_device": "/dev/nbd0",
00:06:02.724 "bdev_name": "Malloc0"
00:06:02.724 },
00:06:02.724 {
00:06:02.724 "nbd_device": "/dev/nbd1",
00:06:02.724 "bdev_name": "Malloc1"
00:06:02.724 }
00:06:02.724 ]'
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@64 -- # echo '[
00:06:02.724 {
00:06:02.724 "nbd_device": "/dev/nbd0",
00:06:02.724 "bdev_name": "Malloc0"
00:06:02.724 },
00:06:02.724 {
00:06:02.724 "nbd_device": "/dev/nbd1",
00:06:02.724 "bdev_name": "Malloc1"
00:06:02.724 }
00:06:02.724 ]'
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:06:02.724 /dev/nbd1'
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:06:02.724 /dev/nbd1'
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@65 -- # count=2
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@66 -- # echo 2
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@95 -- # count=2
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@71 -- # local operation=write
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:06:02.724 16:41:48 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:06:02.984 256+0 records in
00:06:02.984 256+0 records out
00:06:02.984 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.011522 s, 91.0 MB/s
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:06:02.984 256+0 records in
00:06:02.984 256+0 records out
00:06:02.984 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195595 s, 53.6 MB/s
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:06:02.984 256+0 records in
00:06:02.984 256+0 records out
00:06:02.984 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0208359 s, 50.3 MB/s
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@71 -- # local operation=verify
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@50 -- # local nbd_list
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@51 -- # local i
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:06:02.984 16:41:48 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@41 -- # break
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@45 -- # return 0
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@41 -- # break
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@45 -- # return 0
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:06:03.243 16:41:48 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@64 -- # echo '[]'
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@65 -- # echo ''
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@65 -- # true
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@65 -- # count=0
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@66 -- # echo 0
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@104 -- # count=0
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:06:03.517 16:41:49 -- bdev/nbd_common.sh@109 -- # return 0
00:06:03.517 16:41:49 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:06:03.780 16:41:49 -- event/event.sh@35 -- # sleep 3
00:06:04.039 [2024-11-16 16:41:49.533047] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:06:04.039 [2024-11-16 16:41:49.565884] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:04.039 [2024-11-16 16:41:49.565887] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:04.039 [2024-11-16 16:41:49.605596] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:06:04.039 [2024-11-16 16:41:49.605638] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:06:07.325 16:41:52 -- event/event.sh@23 -- # for i in {0..2} 00:06:07.325 16:41:52 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:07.325 spdk_app_start Round 2 00:06:07.325 16:41:52 -- event/event.sh@25 -- # waitforlisten 463902 /var/tmp/spdk-nbd.sock 00:06:07.325 16:41:52 -- common/autotest_common.sh@829 -- # '[' -z 463902 ']' 00:06:07.325 16:41:52 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:07.325 16:41:52 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:07.325 16:41:52 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:07.325 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:07.325 16:41:52 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:07.325 16:41:52 -- common/autotest_common.sh@10 -- # set +x 00:06:07.325 16:41:52 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:07.325 16:41:52 -- common/autotest_common.sh@862 -- # return 0 00:06:07.325 16:41:52 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:07.325 Malloc0 00:06:07.325 16:41:52 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:07.325 Malloc1 00:06:07.325 16:41:52 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@12 -- # local i 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.325 16:41:52 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:07.584 /dev/nbd0 00:06:07.584 16:41:53 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:07.584 16:41:53 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:07.584 16:41:53 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:06:07.585 16:41:53 -- common/autotest_common.sh@867 -- # local i 00:06:07.585 16:41:53 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:07.585 16:41:53 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:07.585 16:41:53 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:06:07.585 16:41:53 -- common/autotest_common.sh@871 -- # break 00:06:07.585 16:41:53 -- common/autotest_common.sh@882 -- # (( i 
= 1 )) 00:06:07.585 16:41:53 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:07.585 16:41:53 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:07.585 1+0 records in 00:06:07.585 1+0 records out 00:06:07.585 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235471 s, 17.4 MB/s 00:06:07.585 16:41:53 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.585 16:41:53 -- common/autotest_common.sh@884 -- # size=4096 00:06:07.585 16:41:53 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.585 16:41:53 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:07.585 16:41:53 -- common/autotest_common.sh@887 -- # return 0 00:06:07.585 16:41:53 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:07.585 16:41:53 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.585 16:41:53 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:07.585 /dev/nbd1 00:06:07.585 16:41:53 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:07.585 16:41:53 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:07.585 16:41:53 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:06:07.844 16:41:53 -- common/autotest_common.sh@867 -- # local i 00:06:07.844 16:41:53 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:06:07.844 16:41:53 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:06:07.844 16:41:53 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:06:07.844 16:41:53 -- common/autotest_common.sh@871 -- # break 00:06:07.844 16:41:53 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:06:07.844 16:41:53 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:06:07.844 16:41:53 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:07.844 1+0 records in 00:06:07.844 1+0 records out 00:06:07.844 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241478 s, 17.0 MB/s 00:06:07.844 16:41:53 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.844 16:41:53 -- common/autotest_common.sh@884 -- # size=4096 00:06:07.844 16:41:53 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:07.844 16:41:53 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:06:07.844 16:41:53 -- common/autotest_common.sh@887 -- # return 0 00:06:07.844 16:41:53 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:07.844 16:41:53 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:07.844 16:41:53 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:07.844 16:41:53 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:07.844 16:41:53 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:07.845 { 00:06:07.845 "nbd_device": "/dev/nbd0", 00:06:07.845 "bdev_name": "Malloc0" 00:06:07.845 }, 00:06:07.845 { 00:06:07.845 "nbd_device": "/dev/nbd1", 00:06:07.845 "bdev_name": "Malloc1" 00:06:07.845 } 00:06:07.845 ]' 00:06:07.845 16:41:53 -- 
bdev/nbd_common.sh@64 -- # echo '[ 00:06:07.845 { 00:06:07.845 "nbd_device": "/dev/nbd0", 00:06:07.845 "bdev_name": "Malloc0" 00:06:07.845 }, 00:06:07.845 { 00:06:07.845 "nbd_device": "/dev/nbd1", 00:06:07.845 "bdev_name": "Malloc1" 00:06:07.845 } 00:06:07.845 ]' 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:07.845 /dev/nbd1' 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:07.845 /dev/nbd1' 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@65 -- # count=2 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@95 -- # count=2 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:07.845 16:41:53 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:08.104 256+0 records in 00:06:08.104 256+0 records out 00:06:08.104 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0112478 s, 93.2 MB/s 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:08.104 256+0 records in 00:06:08.104 256+0 records out 00:06:08.104 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195778 s, 53.6 MB/s 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:08.104 256+0 records in 00:06:08.104 256+0 records out 00:06:08.104 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.020879 s, 50.2 MB/s 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 
/dev/nbd1 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@51 -- # local i 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:08.104 16:41:53 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@41 -- # break 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@45 -- # return 0 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:08.363 16:41:53 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@41 -- # break 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@45 -- # return 0 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:08.363 16:41:54 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@65 -- # true 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@65 -- # count=0 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@104 -- # count=0 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:08.623 16:41:54 -- bdev/nbd_common.sh@109 -- # return 0 00:06:08.623 16:41:54 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:08.882 16:41:54 -- event/event.sh@35 -- # sleep 3 00:06:09.141 [2024-11-16 16:41:54.676465] app.c: 
798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:09.141 [2024-11-16 16:41:54.708957] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:09.141 [2024-11-16 16:41:54.708959] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.141 [2024-11-16 16:41:54.748639] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:09.141 [2024-11-16 16:41:54.748686] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:12.479 16:41:57 -- event/event.sh@38 -- # waitforlisten 463902 /var/tmp/spdk-nbd.sock 00:06:12.479 16:41:57 -- common/autotest_common.sh@829 -- # '[' -z 463902 ']' 00:06:12.479 16:41:57 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:12.479 16:41:57 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:12.479 16:41:57 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:12.479 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:12.479 16:41:57 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:12.479 16:41:57 -- common/autotest_common.sh@10 -- # set +x 00:06:12.479 16:41:57 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:12.479 16:41:57 -- common/autotest_common.sh@862 -- # return 0 00:06:12.479 16:41:57 -- event/event.sh@39 -- # killprocess 463902 00:06:12.479 16:41:57 -- common/autotest_common.sh@936 -- # '[' -z 463902 ']' 00:06:12.479 16:41:57 -- common/autotest_common.sh@940 -- # kill -0 463902 00:06:12.479 16:41:57 -- common/autotest_common.sh@941 -- # uname 00:06:12.479 16:41:57 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:12.479 16:41:57 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 463902 00:06:12.479 16:41:57 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:12.479 16:41:57 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:12.479 16:41:57 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 463902' 00:06:12.479 killing process with pid 463902 00:06:12.479 16:41:57 -- common/autotest_common.sh@955 -- # kill 463902 00:06:12.479 16:41:57 -- common/autotest_common.sh@960 -- # wait 463902 00:06:12.479 spdk_app_start is called in Round 0. 00:06:12.479 Shutdown signal received, stop current app iteration 00:06:12.479 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:12.479 spdk_app_start is called in Round 1. 00:06:12.479 Shutdown signal received, stop current app iteration 00:06:12.479 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:12.479 spdk_app_start is called in Round 2. 00:06:12.479 Shutdown signal received, stop current app iteration 00:06:12.479 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 reinitialization... 00:06:12.479 spdk_app_start is called in Round 3. 
00:06:12.479 Shutdown signal received, stop current app iteration 00:06:12.479 16:41:57 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:12.479 16:41:57 -- event/event.sh@42 -- # return 0 00:06:12.479 00:06:12.479 real 0m16.468s 00:06:12.479 user 0m35.264s 00:06:12.479 sys 0m3.088s 00:06:12.479 16:41:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:12.479 16:41:57 -- common/autotest_common.sh@10 -- # set +x 00:06:12.479 ************************************ 00:06:12.479 END TEST app_repeat 00:06:12.479 ************************************ 00:06:12.479 16:41:57 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:12.479 16:41:57 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:12.479 16:41:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:12.480 16:41:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:12.480 16:41:57 -- common/autotest_common.sh@10 -- # set +x 00:06:12.480 ************************************ 00:06:12.480 START TEST cpu_locks 00:06:12.480 ************************************ 00:06:12.480 16:41:57 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:12.480 * Looking for test storage... 00:06:12.480 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:12.480 16:41:58 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:12.480 16:41:58 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:12.480 16:41:58 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:12.480 16:41:58 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:12.480 16:41:58 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:12.480 16:41:58 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:12.480 16:41:58 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:12.480 16:41:58 -- scripts/common.sh@335 -- # IFS=.-: 00:06:12.480 16:41:58 -- scripts/common.sh@335 -- # read -ra ver1 00:06:12.480 16:41:58 -- scripts/common.sh@336 -- # IFS=.-: 00:06:12.480 16:41:58 -- scripts/common.sh@336 -- # read -ra ver2 00:06:12.480 16:41:58 -- scripts/common.sh@337 -- # local 'op=<' 00:06:12.480 16:41:58 -- scripts/common.sh@339 -- # ver1_l=2 00:06:12.480 16:41:58 -- scripts/common.sh@340 -- # ver2_l=1 00:06:12.480 16:41:58 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:12.480 16:41:58 -- scripts/common.sh@343 -- # case "$op" in 00:06:12.480 16:41:58 -- scripts/common.sh@344 -- # : 1 00:06:12.480 16:41:58 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:12.480 16:41:58 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:12.480 16:41:58 -- scripts/common.sh@364 -- # decimal 1 00:06:12.480 16:41:58 -- scripts/common.sh@352 -- # local d=1 00:06:12.480 16:41:58 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:12.480 16:41:58 -- scripts/common.sh@354 -- # echo 1 00:06:12.480 16:41:58 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:12.480 16:41:58 -- scripts/common.sh@365 -- # decimal 2 00:06:12.480 16:41:58 -- scripts/common.sh@352 -- # local d=2 00:06:12.480 16:41:58 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:12.480 16:41:58 -- scripts/common.sh@354 -- # echo 2 00:06:12.480 16:41:58 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:12.480 16:41:58 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:12.480 16:41:58 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:12.480 16:41:58 -- scripts/common.sh@367 -- # return 0 00:06:12.480 16:41:58 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:12.480 16:41:58 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:12.480 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.480 --rc genhtml_branch_coverage=1 00:06:12.480 --rc genhtml_function_coverage=1 00:06:12.480 --rc genhtml_legend=1 00:06:12.480 --rc geninfo_all_blocks=1 00:06:12.480 --rc geninfo_unexecuted_blocks=1 00:06:12.480 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.480 ' 00:06:12.480 16:41:58 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:12.480 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.480 --rc genhtml_branch_coverage=1 00:06:12.480 --rc genhtml_function_coverage=1 00:06:12.480 --rc genhtml_legend=1 00:06:12.480 --rc geninfo_all_blocks=1 00:06:12.480 --rc geninfo_unexecuted_blocks=1 00:06:12.480 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.480 ' 00:06:12.480 16:41:58 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:12.480 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.480 --rc genhtml_branch_coverage=1 00:06:12.480 --rc genhtml_function_coverage=1 00:06:12.480 --rc genhtml_legend=1 00:06:12.480 --rc geninfo_all_blocks=1 00:06:12.480 --rc geninfo_unexecuted_blocks=1 00:06:12.480 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.480 ' 00:06:12.480 16:41:58 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:12.480 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:12.480 --rc genhtml_branch_coverage=1 00:06:12.480 --rc genhtml_function_coverage=1 00:06:12.480 --rc genhtml_legend=1 00:06:12.480 --rc geninfo_all_blocks=1 00:06:12.480 --rc geninfo_unexecuted_blocks=1 00:06:12.480 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:12.480 ' 00:06:12.480 16:41:58 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:12.480 16:41:58 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:12.480 16:41:58 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:12.480 16:41:58 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:12.480 16:41:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:12.480 16:41:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:12.480 16:41:58 -- common/autotest_common.sh@10 -- # set +x 00:06:12.480 ************************************ 00:06:12.480 START TEST default_locks 
00:06:12.480 ************************************ 00:06:12.480 16:41:58 -- common/autotest_common.sh@1114 -- # default_locks 00:06:12.480 16:41:58 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=467100 00:06:12.480 16:41:58 -- event/cpu_locks.sh@47 -- # waitforlisten 467100 00:06:12.480 16:41:58 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:12.480 16:41:58 -- common/autotest_common.sh@829 -- # '[' -z 467100 ']' 00:06:12.480 16:41:58 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:12.480 16:41:58 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:12.480 16:41:58 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:12.480 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:12.480 16:41:58 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:12.480 16:41:58 -- common/autotest_common.sh@10 -- # set +x 00:06:12.480 [2024-11-16 16:41:58.181420] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:12.480 [2024-11-16 16:41:58.181510] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid467100 ] 00:06:12.480 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.739 [2024-11-16 16:41:58.249828] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.739 [2024-11-16 16:41:58.285871] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:12.739 [2024-11-16 16:41:58.286007] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.310 16:41:59 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:13.310 16:41:59 -- common/autotest_common.sh@862 -- # return 0 00:06:13.310 16:41:59 -- event/cpu_locks.sh@49 -- # locks_exist 467100 00:06:13.310 16:41:59 -- event/cpu_locks.sh@22 -- # lslocks -p 467100 00:06:13.310 16:41:59 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:14.248 lslocks: write error 00:06:14.248 16:41:59 -- event/cpu_locks.sh@50 -- # killprocess 467100 00:06:14.248 16:41:59 -- common/autotest_common.sh@936 -- # '[' -z 467100 ']' 00:06:14.248 16:41:59 -- common/autotest_common.sh@940 -- # kill -0 467100 00:06:14.248 16:41:59 -- common/autotest_common.sh@941 -- # uname 00:06:14.248 16:41:59 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:14.248 16:41:59 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 467100 00:06:14.248 16:41:59 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:14.248 16:41:59 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:14.248 16:41:59 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 467100' 00:06:14.248 killing process with pid 467100 00:06:14.248 16:41:59 -- common/autotest_common.sh@955 -- # kill 467100 00:06:14.248 16:41:59 -- common/autotest_common.sh@960 -- # wait 467100 00:06:14.507 16:42:00 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 467100 00:06:14.507 16:42:00 -- common/autotest_common.sh@650 -- # local es=0 00:06:14.507 16:42:00 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 467100 00:06:14.507 16:42:00 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:14.507 16:42:00 -- common/autotest_common.sh@642 -- 
# case "$(type -t "$arg")" in 00:06:14.507 16:42:00 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:14.507 16:42:00 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:14.507 16:42:00 -- common/autotest_common.sh@653 -- # waitforlisten 467100 00:06:14.507 16:42:00 -- common/autotest_common.sh@829 -- # '[' -z 467100 ']' 00:06:14.507 16:42:00 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.507 16:42:00 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.507 16:42:00 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.507 16:42:00 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.507 16:42:00 -- common/autotest_common.sh@10 -- # set +x 00:06:14.507 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (467100) - No such process 00:06:14.507 ERROR: process (pid: 467100) is no longer running 00:06:14.507 16:42:00 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:14.507 16:42:00 -- common/autotest_common.sh@862 -- # return 1 00:06:14.507 16:42:00 -- common/autotest_common.sh@653 -- # es=1 00:06:14.507 16:42:00 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:14.507 16:42:00 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:14.507 16:42:00 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:14.507 16:42:00 -- event/cpu_locks.sh@54 -- # no_locks 00:06:14.507 16:42:00 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:14.507 16:42:00 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:14.507 16:42:00 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:14.507 00:06:14.507 real 0m1.909s 00:06:14.507 user 0m2.025s 00:06:14.507 sys 0m0.702s 00:06:14.507 16:42:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:14.507 16:42:00 -- common/autotest_common.sh@10 -- # set +x 00:06:14.507 ************************************ 00:06:14.507 END TEST default_locks 00:06:14.507 ************************************ 00:06:14.507 16:42:00 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:14.507 16:42:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:14.507 16:42:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.507 16:42:00 -- common/autotest_common.sh@10 -- # set +x 00:06:14.507 ************************************ 00:06:14.507 START TEST default_locks_via_rpc 00:06:14.507 ************************************ 00:06:14.507 16:42:00 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:06:14.507 16:42:00 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=467433 00:06:14.507 16:42:00 -- event/cpu_locks.sh@63 -- # waitforlisten 467433 00:06:14.507 16:42:00 -- common/autotest_common.sh@829 -- # '[' -z 467433 ']' 00:06:14.507 16:42:00 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.507 16:42:00 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.507 16:42:00 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.507 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:14.507 16:42:00 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.507 16:42:00 -- common/autotest_common.sh@10 -- # set +x 00:06:14.508 16:42:00 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:14.508 [2024-11-16 16:42:00.125474] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:14.508 [2024-11-16 16:42:00.125564] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid467433 ] 00:06:14.508 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.508 [2024-11-16 16:42:00.193436] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.508 [2024-11-16 16:42:00.231465] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:14.508 [2024-11-16 16:42:00.231585] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.446 16:42:00 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:15.446 16:42:00 -- common/autotest_common.sh@862 -- # return 0 00:06:15.446 16:42:00 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:15.446 16:42:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:15.446 16:42:00 -- common/autotest_common.sh@10 -- # set +x 00:06:15.446 16:42:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:15.446 16:42:00 -- event/cpu_locks.sh@67 -- # no_locks 00:06:15.446 16:42:00 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:15.446 16:42:00 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:15.446 16:42:00 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:15.446 16:42:00 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:15.446 16:42:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:15.446 16:42:00 -- common/autotest_common.sh@10 -- # set +x 00:06:15.446 16:42:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:15.446 16:42:00 -- event/cpu_locks.sh@71 -- # locks_exist 467433 00:06:15.446 16:42:00 -- event/cpu_locks.sh@22 -- # lslocks -p 467433 00:06:15.446 16:42:00 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:15.705 16:42:01 -- event/cpu_locks.sh@73 -- # killprocess 467433 00:06:15.705 16:42:01 -- common/autotest_common.sh@936 -- # '[' -z 467433 ']' 00:06:15.705 16:42:01 -- common/autotest_common.sh@940 -- # kill -0 467433 00:06:15.705 16:42:01 -- common/autotest_common.sh@941 -- # uname 00:06:15.705 16:42:01 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:15.705 16:42:01 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 467433 00:06:15.964 16:42:01 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:15.964 16:42:01 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:15.964 16:42:01 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 467433' 00:06:15.964 killing process with pid 467433 00:06:15.964 16:42:01 -- common/autotest_common.sh@955 -- # kill 467433 00:06:15.964 16:42:01 -- common/autotest_common.sh@960 -- # wait 467433 00:06:16.223 00:06:16.223 real 0m1.668s 00:06:16.223 user 0m1.752s 00:06:16.223 sys 0m0.585s 00:06:16.223 16:42:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:16.223 16:42:01 -- common/autotest_common.sh@10 -- # set +x 00:06:16.223 ************************************ 00:06:16.223 END TEST 
default_locks_via_rpc 00:06:16.223 ************************************ 00:06:16.223 16:42:01 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:16.223 16:42:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:16.223 16:42:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:16.223 16:42:01 -- common/autotest_common.sh@10 -- # set +x 00:06:16.223 ************************************ 00:06:16.223 START TEST non_locking_app_on_locked_coremask 00:06:16.223 ************************************ 00:06:16.223 16:42:01 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:06:16.223 16:42:01 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=467938 00:06:16.223 16:42:01 -- event/cpu_locks.sh@81 -- # waitforlisten 467938 /var/tmp/spdk.sock 00:06:16.223 16:42:01 -- common/autotest_common.sh@829 -- # '[' -z 467938 ']' 00:06:16.223 16:42:01 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.223 16:42:01 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:16.223 16:42:01 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:16.223 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:16.223 16:42:01 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:16.223 16:42:01 -- common/autotest_common.sh@10 -- # set +x 00:06:16.223 16:42:01 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:16.223 [2024-11-16 16:42:01.841255] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:16.223 [2024-11-16 16:42:01.841335] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid467938 ] 00:06:16.223 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.223 [2024-11-16 16:42:01.909173] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.223 [2024-11-16 16:42:01.946740] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.223 [2024-11-16 16:42:01.946856] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.161 16:42:02 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:17.161 16:42:02 -- common/autotest_common.sh@862 -- # return 0 00:06:17.161 16:42:02 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=468104 00:06:17.161 16:42:02 -- event/cpu_locks.sh@85 -- # waitforlisten 468104 /var/tmp/spdk2.sock 00:06:17.161 16:42:02 -- common/autotest_common.sh@829 -- # '[' -z 468104 ']' 00:06:17.161 16:42:02 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:17.161 16:42:02 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:17.161 16:42:02 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:17.161 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
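non_locking_app_on_locked_coremask, which starts here, runs two targets on the same core: the first spdk_tgt -m 0x1 claims core 0, and the second is launched with --disable-cpumask-locks so it skips the per-core lock files and can coexist. Roughly, as a sketch with placeholder pid variables and the waitforlisten helper assumed from the harness:

    SPDK_BIN=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt

    "$SPDK_BIN" -m 0x1 &                 # first target claims core 0
    pid1=$!
    waitforlisten "$pid1" /var/tmp/spdk.sock

    # same core, second RPC socket; without --disable-cpumask-locks this
    # instance would abort with "Cannot create lock on core 0"
    "$SPDK_BIN" -m 0x1 -r /var/tmp/spdk2.sock --disable-cpumask-locks &
    pid2=$!
    waitforlisten "$pid2" /var/tmp/spdk2.sock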
00:06:17.161 16:42:02 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:17.161 16:42:02 -- common/autotest_common.sh@10 -- # set +x 00:06:17.161 16:42:02 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:17.161 [2024-11-16 16:42:02.694568] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:17.161 [2024-11-16 16:42:02.694661] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468104 ] 00:06:17.161 EAL: No free 2048 kB hugepages reported on node 1 00:06:17.161 [2024-11-16 16:42:02.785805] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:17.161 [2024-11-16 16:42:02.785833] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.161 [2024-11-16 16:42:02.859304] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:17.161 [2024-11-16 16:42:02.859438] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.098 16:42:03 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:18.098 16:42:03 -- common/autotest_common.sh@862 -- # return 0 00:06:18.098 16:42:03 -- event/cpu_locks.sh@87 -- # locks_exist 467938 00:06:18.098 16:42:03 -- event/cpu_locks.sh@22 -- # lslocks -p 467938 00:06:18.098 16:42:03 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:18.357 lslocks: write error 00:06:18.357 16:42:03 -- event/cpu_locks.sh@89 -- # killprocess 467938 00:06:18.357 16:42:03 -- common/autotest_common.sh@936 -- # '[' -z 467938 ']' 00:06:18.357 16:42:03 -- common/autotest_common.sh@940 -- # kill -0 467938 00:06:18.357 16:42:03 -- common/autotest_common.sh@941 -- # uname 00:06:18.357 16:42:03 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:18.357 16:42:03 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 467938 00:06:18.357 16:42:04 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:18.357 16:42:04 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:18.357 16:42:04 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 467938' 00:06:18.357 killing process with pid 467938 00:06:18.357 16:42:04 -- common/autotest_common.sh@955 -- # kill 467938 00:06:18.357 16:42:04 -- common/autotest_common.sh@960 -- # wait 467938 00:06:18.925 16:42:04 -- event/cpu_locks.sh@90 -- # killprocess 468104 00:06:18.925 16:42:04 -- common/autotest_common.sh@936 -- # '[' -z 468104 ']' 00:06:18.925 16:42:04 -- common/autotest_common.sh@940 -- # kill -0 468104 00:06:18.925 16:42:04 -- common/autotest_common.sh@941 -- # uname 00:06:18.925 16:42:04 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:18.925 16:42:04 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 468104 00:06:18.926 16:42:04 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:18.926 16:42:04 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:18.926 16:42:04 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 468104' 00:06:18.926 killing process with pid 468104 00:06:18.926 16:42:04 -- common/autotest_common.sh@955 -- # kill 468104 00:06:18.926 16:42:04 -- common/autotest_common.sh@960 -- # wait 468104 00:06:19.494 00:06:19.494 real 0m3.131s 00:06:19.494 user 0m3.349s 00:06:19.494 sys 
0m0.995s 00:06:19.494 16:42:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:19.494 16:42:04 -- common/autotest_common.sh@10 -- # set +x 00:06:19.494 ************************************ 00:06:19.494 END TEST non_locking_app_on_locked_coremask 00:06:19.494 ************************************ 00:06:19.494 16:42:04 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:19.494 16:42:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:19.494 16:42:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:19.494 16:42:04 -- common/autotest_common.sh@10 -- # set +x 00:06:19.494 ************************************ 00:06:19.494 START TEST locking_app_on_unlocked_coremask 00:06:19.494 ************************************ 00:06:19.494 16:42:04 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:06:19.494 16:42:04 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=468900 00:06:19.494 16:42:04 -- event/cpu_locks.sh@99 -- # waitforlisten 468900 /var/tmp/spdk.sock 00:06:19.494 16:42:04 -- common/autotest_common.sh@829 -- # '[' -z 468900 ']' 00:06:19.494 16:42:04 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:19.494 16:42:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:19.494 16:42:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:19.494 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:19.494 16:42:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:19.494 16:42:04 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:19.494 16:42:04 -- common/autotest_common.sh@10 -- # set +x 00:06:19.494 [2024-11-16 16:42:05.017584] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:19.494 [2024-11-16 16:42:05.017663] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid468900 ] 00:06:19.494 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.494 [2024-11-16 16:42:05.084970] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:19.494 [2024-11-16 16:42:05.084998] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.494 [2024-11-16 16:42:05.122865] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:19.494 [2024-11-16 16:42:05.122981] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.432 16:42:05 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:20.432 16:42:05 -- common/autotest_common.sh@862 -- # return 0 00:06:20.432 16:42:05 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=469106 00:06:20.432 16:42:05 -- event/cpu_locks.sh@103 -- # waitforlisten 469106 /var/tmp/spdk2.sock 00:06:20.432 16:42:05 -- common/autotest_common.sh@829 -- # '[' -z 469106 ']' 00:06:20.432 16:42:05 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:20.432 16:42:05 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:20.432 16:42:05 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
00:06:20.432 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:20.432 16:42:05 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:20.432 16:42:05 -- common/autotest_common.sh@10 -- # set +x 00:06:20.432 16:42:05 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:20.432 [2024-11-16 16:42:05.862930] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:20.433 [2024-11-16 16:42:05.863000] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469106 ] 00:06:20.433 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.433 [2024-11-16 16:42:05.952355] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.433 [2024-11-16 16:42:06.029339] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:20.433 [2024-11-16 16:42:06.029466] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:21.001 16:42:06 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:21.001 16:42:06 -- common/autotest_common.sh@862 -- # return 0 00:06:21.001 16:42:06 -- event/cpu_locks.sh@105 -- # locks_exist 469106 00:06:21.001 16:42:06 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:21.001 16:42:06 -- event/cpu_locks.sh@22 -- # lslocks -p 469106 00:06:22.376 lslocks: write error 00:06:22.376 16:42:07 -- event/cpu_locks.sh@107 -- # killprocess 468900 00:06:22.376 16:42:07 -- common/autotest_common.sh@936 -- # '[' -z 468900 ']' 00:06:22.376 16:42:07 -- common/autotest_common.sh@940 -- # kill -0 468900 00:06:22.376 16:42:07 -- common/autotest_common.sh@941 -- # uname 00:06:22.376 16:42:07 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:22.376 16:42:07 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 468900 00:06:22.376 16:42:07 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:22.376 16:42:07 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:22.376 16:42:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 468900' 00:06:22.376 killing process with pid 468900 00:06:22.376 16:42:07 -- common/autotest_common.sh@955 -- # kill 468900 00:06:22.376 16:42:07 -- common/autotest_common.sh@960 -- # wait 468900 00:06:22.942 16:42:08 -- event/cpu_locks.sh@108 -- # killprocess 469106 00:06:22.942 16:42:08 -- common/autotest_common.sh@936 -- # '[' -z 469106 ']' 00:06:22.942 16:42:08 -- common/autotest_common.sh@940 -- # kill -0 469106 00:06:22.942 16:42:08 -- common/autotest_common.sh@941 -- # uname 00:06:22.942 16:42:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:22.943 16:42:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 469106 00:06:22.943 16:42:08 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:22.943 16:42:08 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:22.943 16:42:08 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 469106' 00:06:22.943 killing process with pid 469106 00:06:22.943 16:42:08 -- common/autotest_common.sh@955 -- # kill 469106 00:06:22.943 16:42:08 -- common/autotest_common.sh@960 -- # wait 469106 00:06:23.202 00:06:23.202 real 0m3.835s 00:06:23.202 user 0m4.093s 00:06:23.202 sys 0m1.333s 00:06:23.202 16:42:08 
-- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:23.202 16:42:08 -- common/autotest_common.sh@10 -- # set +x 00:06:23.202 ************************************ 00:06:23.202 END TEST locking_app_on_unlocked_coremask 00:06:23.202 ************************************ 00:06:23.202 16:42:08 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:23.202 16:42:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:23.202 16:42:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:23.202 16:42:08 -- common/autotest_common.sh@10 -- # set +x 00:06:23.202 ************************************ 00:06:23.202 START TEST locking_app_on_locked_coremask 00:06:23.202 ************************************ 00:06:23.202 16:42:08 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:06:23.202 16:42:08 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=469686 00:06:23.202 16:42:08 -- event/cpu_locks.sh@116 -- # waitforlisten 469686 /var/tmp/spdk.sock 00:06:23.202 16:42:08 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:23.202 16:42:08 -- common/autotest_common.sh@829 -- # '[' -z 469686 ']' 00:06:23.202 16:42:08 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:23.202 16:42:08 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:23.202 16:42:08 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:23.202 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:23.202 16:42:08 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:23.202 16:42:08 -- common/autotest_common.sh@10 -- # set +x 00:06:23.202 [2024-11-16 16:42:08.904944] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
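The locks_exist probe that recurs through these tests is lslocks filtered for the SPDK lock-file name:

    locks_exist() {
        # a running target that claimed its cores holds locks on
        # /var/tmp/spdk_cpu_lock_*, which lslocks reports per pid
        lslocks -p "$1" | grep -q spdk_cpu_lock
    }

The stray "lslocks: write error" lines in the trace are consistent with grep -q closing the pipe as soon as it matches, leaving lslocks unable to finish writing its table; they do not indicate a test failure.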
00:06:23.202 [2024-11-16 16:42:08.905036] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469686 ] 00:06:23.202 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.461 [2024-11-16 16:42:08.971334] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:23.461 [2024-11-16 16:42:09.003657] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:23.461 [2024-11-16 16:42:09.003783] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.028 16:42:09 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:24.028 16:42:09 -- common/autotest_common.sh@862 -- # return 0 00:06:24.028 16:42:09 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=469950 00:06:24.028 16:42:09 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 469950 /var/tmp/spdk2.sock 00:06:24.028 16:42:09 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:24.028 16:42:09 -- common/autotest_common.sh@650 -- # local es=0 00:06:24.028 16:42:09 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 469950 /var/tmp/spdk2.sock 00:06:24.028 16:42:09 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:24.028 16:42:09 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.028 16:42:09 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:24.028 16:42:09 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.028 16:42:09 -- common/autotest_common.sh@653 -- # waitforlisten 469950 /var/tmp/spdk2.sock 00:06:24.028 16:42:09 -- common/autotest_common.sh@829 -- # '[' -z 469950 ']' 00:06:24.028 16:42:09 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:24.028 16:42:09 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:24.028 16:42:09 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:24.028 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:24.028 16:42:09 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:24.028 16:42:09 -- common/autotest_common.sh@10 -- # set +x 00:06:24.028 [2024-11-16 16:42:09.757709] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:24.028 [2024-11-16 16:42:09.757773] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid469950 ] 00:06:24.286 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.286 [2024-11-16 16:42:09.850121] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 469686 has claimed it. 00:06:24.286 [2024-11-16 16:42:09.850159] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
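The two app.c messages above are the expected outcome: the first target (pid 469686) still holds the core 0 lock, so the second instance cannot claim it and exits. SPDK takes these claims as non-blocking file locks on per-core paths; a hypothetical shell approximation of that behavior (claim_core is an illustrative name, not SPDK's C implementation):

    claim_core() {
        local core=$1 fd
        # one lock file per core, e.g. /var/tmp/spdk_cpu_lock_000 for core 0
        exec {fd}>"/var/tmp/spdk_cpu_lock_$(printf '%03d' "$core")"
        # non-blocking: if another process holds the lock, fail immediately
        flock -n "$fd" || { echo "core $core already claimed" >&2; return 1; }
    }

The shell keeps the descriptor open after the function returns, which is what holds the claim until the process exits.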
00:06:24.854 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (469950) - No such process 00:06:24.854 ERROR: process (pid: 469950) is no longer running 00:06:24.854 16:42:10 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:24.854 16:42:10 -- common/autotest_common.sh@862 -- # return 1 00:06:24.854 16:42:10 -- common/autotest_common.sh@653 -- # es=1 00:06:24.854 16:42:10 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:24.854 16:42:10 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:24.854 16:42:10 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:24.854 16:42:10 -- event/cpu_locks.sh@122 -- # locks_exist 469686 00:06:24.854 16:42:10 -- event/cpu_locks.sh@22 -- # lslocks -p 469686 00:06:24.854 16:42:10 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:25.423 lslocks: write error 00:06:25.423 16:42:10 -- event/cpu_locks.sh@124 -- # killprocess 469686 00:06:25.423 16:42:10 -- common/autotest_common.sh@936 -- # '[' -z 469686 ']' 00:06:25.423 16:42:10 -- common/autotest_common.sh@940 -- # kill -0 469686 00:06:25.423 16:42:10 -- common/autotest_common.sh@941 -- # uname 00:06:25.423 16:42:10 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:25.423 16:42:10 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 469686 00:06:25.423 16:42:11 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:25.424 16:42:11 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:25.424 16:42:11 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 469686' 00:06:25.424 killing process with pid 469686 00:06:25.424 16:42:11 -- common/autotest_common.sh@955 -- # kill 469686 00:06:25.424 16:42:11 -- common/autotest_common.sh@960 -- # wait 469686 00:06:25.684 00:06:25.684 real 0m2.461s 00:06:25.684 user 0m2.700s 00:06:25.684 sys 0m0.743s 00:06:25.684 16:42:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:25.684 16:42:11 -- common/autotest_common.sh@10 -- # set +x 00:06:25.684 ************************************ 00:06:25.684 END TEST locking_app_on_locked_coremask 00:06:25.684 ************************************ 00:06:25.684 16:42:11 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:25.684 16:42:11 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:25.684 16:42:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:25.684 16:42:11 -- common/autotest_common.sh@10 -- # set +x 00:06:25.684 ************************************ 00:06:25.684 START TEST locking_overlapped_coremask 00:06:25.684 ************************************ 00:06:25.684 16:42:11 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:06:25.684 16:42:11 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=470248 00:06:25.684 16:42:11 -- event/cpu_locks.sh@133 -- # waitforlisten 470248 /var/tmp/spdk.sock 00:06:25.684 16:42:11 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:25.684 16:42:11 -- common/autotest_common.sh@829 -- # '[' -z 470248 ']' 00:06:25.684 16:42:11 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.684 16:42:11 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.684 16:42:11 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:25.684 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:25.684 16:42:11 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.684 16:42:11 -- common/autotest_common.sh@10 -- # set +x 00:06:25.684 [2024-11-16 16:42:11.416431] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:25.684 [2024-11-16 16:42:11.416527] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470248 ] 00:06:25.943 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.943 [2024-11-16 16:42:11.483099] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:25.943 [2024-11-16 16:42:11.516915] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:25.943 [2024-11-16 16:42:11.517073] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:25.943 [2024-11-16 16:42:11.517193] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:25.943 [2024-11-16 16:42:11.517195] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.512 16:42:12 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:26.512 16:42:12 -- common/autotest_common.sh@862 -- # return 0 00:06:26.512 16:42:12 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=470271 00:06:26.512 16:42:12 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:26.512 16:42:12 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 470271 /var/tmp/spdk2.sock 00:06:26.512 16:42:12 -- common/autotest_common.sh@650 -- # local es=0 00:06:26.512 16:42:12 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 470271 /var/tmp/spdk2.sock 00:06:26.512 16:42:12 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:26.512 16:42:12 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:26.512 16:42:12 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:26.512 16:42:12 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:26.512 16:42:12 -- common/autotest_common.sh@653 -- # waitforlisten 470271 /var/tmp/spdk2.sock 00:06:26.512 16:42:12 -- common/autotest_common.sh@829 -- # '[' -z 470271 ']' 00:06:26.512 16:42:12 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:26.512 16:42:12 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:26.512 16:42:12 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:26.512 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:26.512 16:42:12 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:26.512 16:42:12 -- common/autotest_common.sh@10 -- # set +x 00:06:26.512 [2024-11-16 16:42:12.258974] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
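locking_overlapped_coremask pairs a 0x7-mask target (cores 0-2) with a 0x1c-mask target (cores 2-4); the masks intersect on core 2, so the second launch that begins here is expected to die with the "Cannot create lock on core 2" error shown below. The collision is plain bit arithmetic:

    printf 'colliding cores: 0x%x\n' $(( 0x7 & 0x1c ))   # -> 0x4, i.e. core 2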
00:06:26.512 [2024-11-16 16:42:12.259050] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470271 ] 00:06:26.771 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.771 [2024-11-16 16:42:12.353274] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 470248 has claimed it. 00:06:26.771 [2024-11-16 16:42:12.353315] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:27.341 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (470271) - No such process 00:06:27.341 ERROR: process (pid: 470271) is no longer running 00:06:27.341 16:42:12 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:27.341 16:42:12 -- common/autotest_common.sh@862 -- # return 1 00:06:27.341 16:42:12 -- common/autotest_common.sh@653 -- # es=1 00:06:27.341 16:42:12 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:27.341 16:42:12 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:27.341 16:42:12 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:27.341 16:42:12 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:27.341 16:42:12 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:27.341 16:42:12 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:27.341 16:42:12 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:27.341 16:42:12 -- event/cpu_locks.sh@141 -- # killprocess 470248 00:06:27.341 16:42:12 -- common/autotest_common.sh@936 -- # '[' -z 470248 ']' 00:06:27.341 16:42:12 -- common/autotest_common.sh@940 -- # kill -0 470248 00:06:27.341 16:42:12 -- common/autotest_common.sh@941 -- # uname 00:06:27.341 16:42:12 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:27.341 16:42:12 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 470248 00:06:27.341 16:42:12 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:27.341 16:42:12 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:27.341 16:42:12 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 470248' 00:06:27.341 killing process with pid 470248 00:06:27.341 16:42:12 -- common/autotest_common.sh@955 -- # kill 470248 00:06:27.341 16:42:12 -- common/autotest_common.sh@960 -- # wait 470248 00:06:27.601 00:06:27.601 real 0m1.898s 00:06:27.601 user 0m5.460s 00:06:27.601 sys 0m0.453s 00:06:27.601 16:42:13 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:27.601 16:42:13 -- common/autotest_common.sh@10 -- # set +x 00:06:27.601 ************************************ 00:06:27.601 END TEST locking_overlapped_coremask 00:06:27.601 ************************************ 00:06:27.601 16:42:13 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:27.601 16:42:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:27.601 16:42:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:27.601 16:42:13 -- common/autotest_common.sh@10 -- # set +x 00:06:27.601 ************************************ 00:06:27.601 START TEST 
locking_overlapped_coremask_via_rpc 00:06:27.601 ************************************ 00:06:27.601 16:42:13 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:06:27.601 16:42:13 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=470559 00:06:27.601 16:42:13 -- event/cpu_locks.sh@149 -- # waitforlisten 470559 /var/tmp/spdk.sock 00:06:27.601 16:42:13 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:27.601 16:42:13 -- common/autotest_common.sh@829 -- # '[' -z 470559 ']' 00:06:27.601 16:42:13 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:27.601 16:42:13 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:27.601 16:42:13 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:27.601 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:27.601 16:42:13 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:27.601 16:42:13 -- common/autotest_common.sh@10 -- # set +x 00:06:27.861 [2024-11-16 16:42:13.367447] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:27.861 [2024-11-16 16:42:13.367544] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470559 ] 00:06:27.861 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.861 [2024-11-16 16:42:13.434974] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:27.861 [2024-11-16 16:42:13.435005] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:27.861 [2024-11-16 16:42:13.469115] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:27.861 [2024-11-16 16:42:13.469312] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:27.861 [2024-11-16 16:42:13.469412] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:27.861 [2024-11-16 16:42:13.469414] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.798 16:42:14 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:28.798 16:42:14 -- common/autotest_common.sh@862 -- # return 0 00:06:28.798 16:42:14 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=470767 00:06:28.798 16:42:14 -- event/cpu_locks.sh@153 -- # waitforlisten 470767 /var/tmp/spdk2.sock 00:06:28.798 16:42:14 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:28.798 16:42:14 -- common/autotest_common.sh@829 -- # '[' -z 470767 ']' 00:06:28.798 16:42:14 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:28.798 16:42:14 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:28.798 16:42:14 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:28.798 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:28.798 16:42:14 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:28.798 16:42:14 -- common/autotest_common.sh@10 -- # set +x 00:06:28.798 [2024-11-16 16:42:14.221106] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:28.798 [2024-11-16 16:42:14.221173] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid470767 ] 00:06:28.798 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.798 [2024-11-16 16:42:14.312696] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:28.798 [2024-11-16 16:42:14.312727] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:28.798 [2024-11-16 16:42:14.386093] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:28.798 [2024-11-16 16:42:14.386249] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:28.798 [2024-11-16 16:42:14.389712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:28.798 [2024-11-16 16:42:14.389714] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:29.367 16:42:15 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.367 16:42:15 -- common/autotest_common.sh@862 -- # return 0 00:06:29.367 16:42:15 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:29.367 16:42:15 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:29.367 16:42:15 -- common/autotest_common.sh@10 -- # set +x 00:06:29.367 16:42:15 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:29.367 16:42:15 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:29.367 16:42:15 -- common/autotest_common.sh@650 -- # local es=0 00:06:29.367 16:42:15 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:29.367 16:42:15 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:29.367 16:42:15 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:29.367 16:42:15 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:29.367 16:42:15 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:29.367 16:42:15 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:29.367 16:42:15 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:29.367 16:42:15 -- common/autotest_common.sh@10 -- # set +x 00:06:29.367 [2024-11-16 16:42:15.093730] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 470559 has claimed it. 
00:06:29.367 request: 00:06:29.367 { 00:06:29.367 "method": "framework_enable_cpumask_locks", 00:06:29.367 "req_id": 1 00:06:29.367 } 00:06:29.367 Got JSON-RPC error response 00:06:29.367 response: 00:06:29.367 { 00:06:29.367 "code": -32603, 00:06:29.367 "message": "Failed to claim CPU core: 2" 00:06:29.367 } 00:06:29.367 16:42:15 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:29.367 16:42:15 -- common/autotest_common.sh@653 -- # es=1 00:06:29.367 16:42:15 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:29.367 16:42:15 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:29.367 16:42:15 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:29.367 16:42:15 -- event/cpu_locks.sh@158 -- # waitforlisten 470559 /var/tmp/spdk.sock 00:06:29.367 16:42:15 -- common/autotest_common.sh@829 -- # '[' -z 470559 ']' 00:06:29.367 16:42:15 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:29.367 16:42:15 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:29.367 16:42:15 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:29.367 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:29.367 16:42:15 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:29.367 16:42:15 -- common/autotest_common.sh@10 -- # set +x 00:06:29.626 16:42:15 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.626 16:42:15 -- common/autotest_common.sh@862 -- # return 0 00:06:29.626 16:42:15 -- event/cpu_locks.sh@159 -- # waitforlisten 470767 /var/tmp/spdk2.sock 00:06:29.626 16:42:15 -- common/autotest_common.sh@829 -- # '[' -z 470767 ']' 00:06:29.626 16:42:15 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:29.626 16:42:15 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:29.626 16:42:15 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:29.626 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
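Note: the JSON-RPC failure above can be replayed by hand against the second target's socket; a minimal sketch, assuming the same socket paths as this run and that scripts/rpc.py exposes the method under the RPC name shown in the response:

  # The first target (pid 470559, -m 0x7) already claimed core 2 via the earlier
  # successful RPC; masks 0x7 and 0x1c overlap on core 2 (0x7 & 0x1c == 0x4), so
  # this call fails with -32603 "Failed to claim CPU core: 2".
  ./scripts/rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks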
00:06:29.626 16:42:15 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:29.626 16:42:15 -- common/autotest_common.sh@10 -- # set +x 00:06:29.886 16:42:15 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:29.886 16:42:15 -- common/autotest_common.sh@862 -- # return 0 00:06:29.886 16:42:15 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:29.886 16:42:15 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:29.886 16:42:15 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:29.886 16:42:15 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:29.886 00:06:29.886 real 0m2.117s 00:06:29.886 user 0m0.879s 00:06:29.886 sys 0m0.170s 00:06:29.886 16:42:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:29.886 16:42:15 -- common/autotest_common.sh@10 -- # set +x 00:06:29.886 ************************************ 00:06:29.886 END TEST locking_overlapped_coremask_via_rpc 00:06:29.886 ************************************ 00:06:29.886 16:42:15 -- event/cpu_locks.sh@174 -- # cleanup 00:06:29.886 16:42:15 -- event/cpu_locks.sh@15 -- # [[ -z 470559 ]] 00:06:29.886 16:42:15 -- event/cpu_locks.sh@15 -- # killprocess 470559 00:06:29.886 16:42:15 -- common/autotest_common.sh@936 -- # '[' -z 470559 ']' 00:06:29.886 16:42:15 -- common/autotest_common.sh@940 -- # kill -0 470559 00:06:29.886 16:42:15 -- common/autotest_common.sh@941 -- # uname 00:06:29.886 16:42:15 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:29.886 16:42:15 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 470559 00:06:29.886 16:42:15 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:29.886 16:42:15 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:29.886 16:42:15 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 470559' 00:06:29.886 killing process with pid 470559 00:06:29.886 16:42:15 -- common/autotest_common.sh@955 -- # kill 470559 00:06:29.886 16:42:15 -- common/autotest_common.sh@960 -- # wait 470559 00:06:30.146 16:42:15 -- event/cpu_locks.sh@16 -- # [[ -z 470767 ]] 00:06:30.146 16:42:15 -- event/cpu_locks.sh@16 -- # killprocess 470767 00:06:30.146 16:42:15 -- common/autotest_common.sh@936 -- # '[' -z 470767 ']' 00:06:30.146 16:42:15 -- common/autotest_common.sh@940 -- # kill -0 470767 00:06:30.146 16:42:15 -- common/autotest_common.sh@941 -- # uname 00:06:30.146 16:42:15 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:30.146 16:42:15 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 470767 00:06:30.405 16:42:15 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:30.405 16:42:15 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:30.405 16:42:15 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 470767' 00:06:30.405 killing process with pid 470767 00:06:30.405 16:42:15 -- common/autotest_common.sh@955 -- # kill 470767 00:06:30.405 16:42:15 -- common/autotest_common.sh@960 -- # wait 470767 00:06:30.665 16:42:16 -- event/cpu_locks.sh@18 -- # rm -f 00:06:30.665 16:42:16 -- event/cpu_locks.sh@1 -- # cleanup 00:06:30.665 16:42:16 -- event/cpu_locks.sh@15 -- # [[ -z 470559 ]] 00:06:30.665 16:42:16 -- event/cpu_locks.sh@15 -- # killprocess 470559 00:06:30.665 
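Note: check_remaining_locks, traced above, reduces to comparing a glob against a brace expansion; a condensed sketch of exactly what the log shows, with the lock-file prefix from this run:

  # With the target bound to cores 0-2 (-m 0x7), exactly three lock files should
  # remain; any missing or extra /var/tmp/spdk_cpu_lock_* file fails the test.
  locks=(/var/tmp/spdk_cpu_lock_*)
  locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
  [[ "${locks[*]}" == "${locks_expected[*]}" ]]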
16:42:16 -- common/autotest_common.sh@936 -- # '[' -z 470559 ']' 00:06:30.665 16:42:16 -- common/autotest_common.sh@940 -- # kill -0 470559 00:06:30.665 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (470559) - No such process 00:06:30.665 16:42:16 -- common/autotest_common.sh@963 -- # echo 'Process with pid 470559 is not found' 00:06:30.665 Process with pid 470559 is not found 00:06:30.665 16:42:16 -- event/cpu_locks.sh@16 -- # [[ -z 470767 ]] 00:06:30.665 16:42:16 -- event/cpu_locks.sh@16 -- # killprocess 470767 00:06:30.665 16:42:16 -- common/autotest_common.sh@936 -- # '[' -z 470767 ']' 00:06:30.665 16:42:16 -- common/autotest_common.sh@940 -- # kill -0 470767 00:06:30.665 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (470767) - No such process 00:06:30.665 16:42:16 -- common/autotest_common.sh@963 -- # echo 'Process with pid 470767 is not found' 00:06:30.665 Process with pid 470767 is not found 00:06:30.665 16:42:16 -- event/cpu_locks.sh@18 -- # rm -f 00:06:30.665 00:06:30.665 real 0m18.281s 00:06:30.665 user 0m31.226s 00:06:30.665 sys 0m5.937s 00:06:30.665 16:42:16 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:30.665 16:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:30.665 ************************************ 00:06:30.665 END TEST cpu_locks 00:06:30.665 ************************************ 00:06:30.665 00:06:30.665 real 0m43.818s 00:06:30.665 user 1m22.396s 00:06:30.665 sys 0m10.094s 00:06:30.665 16:42:16 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:30.665 16:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:30.665 ************************************ 00:06:30.665 END TEST event 00:06:30.665 ************************************ 00:06:30.665 16:42:16 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:30.665 16:42:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:30.665 16:42:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:30.665 16:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:30.665 ************************************ 00:06:30.665 START TEST thread 00:06:30.665 ************************************ 00:06:30.665 16:42:16 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:30.924 * Looking for test storage... 
00:06:30.924 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:30.924 16:42:16 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:30.924 16:42:16 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:30.924 16:42:16 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:30.924 16:42:16 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:30.924 16:42:16 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:30.924 16:42:16 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:30.924 16:42:16 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:30.924 16:42:16 -- scripts/common.sh@335 -- # IFS=.-: 00:06:30.924 16:42:16 -- scripts/common.sh@335 -- # read -ra ver1 00:06:30.924 16:42:16 -- scripts/common.sh@336 -- # IFS=.-: 00:06:30.924 16:42:16 -- scripts/common.sh@336 -- # read -ra ver2 00:06:30.924 16:42:16 -- scripts/common.sh@337 -- # local 'op=<' 00:06:30.924 16:42:16 -- scripts/common.sh@339 -- # ver1_l=2 00:06:30.924 16:42:16 -- scripts/common.sh@340 -- # ver2_l=1 00:06:30.924 16:42:16 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:30.924 16:42:16 -- scripts/common.sh@343 -- # case "$op" in 00:06:30.924 16:42:16 -- scripts/common.sh@344 -- # : 1 00:06:30.924 16:42:16 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:30.924 16:42:16 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:30.924 16:42:16 -- scripts/common.sh@364 -- # decimal 1 00:06:30.924 16:42:16 -- scripts/common.sh@352 -- # local d=1 00:06:30.924 16:42:16 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:30.924 16:42:16 -- scripts/common.sh@354 -- # echo 1 00:06:30.924 16:42:16 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:30.924 16:42:16 -- scripts/common.sh@365 -- # decimal 2 00:06:30.924 16:42:16 -- scripts/common.sh@352 -- # local d=2 00:06:30.924 16:42:16 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:30.924 16:42:16 -- scripts/common.sh@354 -- # echo 2 00:06:30.924 16:42:16 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:30.924 16:42:16 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:30.924 16:42:16 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:30.924 16:42:16 -- scripts/common.sh@367 -- # return 0 00:06:30.924 16:42:16 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:30.924 16:42:16 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:30.924 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.924 --rc genhtml_branch_coverage=1 00:06:30.924 --rc genhtml_function_coverage=1 00:06:30.924 --rc genhtml_legend=1 00:06:30.924 --rc geninfo_all_blocks=1 00:06:30.924 --rc geninfo_unexecuted_blocks=1 00:06:30.924 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.924 ' 00:06:30.924 16:42:16 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:30.924 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.924 --rc genhtml_branch_coverage=1 00:06:30.924 --rc genhtml_function_coverage=1 00:06:30.924 --rc genhtml_legend=1 00:06:30.924 --rc geninfo_all_blocks=1 00:06:30.924 --rc geninfo_unexecuted_blocks=1 00:06:30.924 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.924 ' 00:06:30.924 16:42:16 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:30.924 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.924 --rc genhtml_branch_coverage=1 
00:06:30.924 --rc genhtml_function_coverage=1 00:06:30.924 --rc genhtml_legend=1 00:06:30.924 --rc geninfo_all_blocks=1 00:06:30.924 --rc geninfo_unexecuted_blocks=1 00:06:30.924 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.924 ' 00:06:30.924 16:42:16 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:30.924 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.924 --rc genhtml_branch_coverage=1 00:06:30.924 --rc genhtml_function_coverage=1 00:06:30.924 --rc genhtml_legend=1 00:06:30.924 --rc geninfo_all_blocks=1 00:06:30.924 --rc geninfo_unexecuted_blocks=1 00:06:30.924 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.924 ' 00:06:30.924 16:42:16 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:30.924 16:42:16 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:30.924 16:42:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:30.924 16:42:16 -- common/autotest_common.sh@10 -- # set +x 00:06:30.924 ************************************ 00:06:30.924 START TEST thread_poller_perf 00:06:30.924 ************************************ 00:06:30.924 16:42:16 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:30.924 [2024-11-16 16:42:16.519432] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:30.924 [2024-11-16 16:42:16.519524] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid471212 ] 00:06:30.924 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.924 [2024-11-16 16:42:16.587879] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.924 [2024-11-16 16:42:16.623492] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:30.924 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:32.302 [2024-11-16T15:42:18.052Z] ====================================== 00:06:32.302 [2024-11-16T15:42:18.052Z] busy:2505491406 (cyc) 00:06:32.302 [2024-11-16T15:42:18.052Z] total_run_count: 790000 00:06:32.302 [2024-11-16T15:42:18.052Z] tsc_hz: 2500000000 (cyc) 00:06:32.302 [2024-11-16T15:42:18.052Z] ====================================== 00:06:32.302 [2024-11-16T15:42:18.052Z] poller_cost: 3171 (cyc), 1268 (nsec) 00:06:32.302 00:06:32.302 real 0m1.180s 00:06:32.302 user 0m1.088s 00:06:32.302 sys 0m0.088s 00:06:32.302 16:42:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:32.302 16:42:17 -- common/autotest_common.sh@10 -- # set +x 00:06:32.302 ************************************ 00:06:32.302 END TEST thread_poller_perf 00:06:32.302 ************************************ 00:06:32.302 16:42:17 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:32.302 16:42:17 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:32.302 16:42:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.302 16:42:17 -- common/autotest_common.sh@10 -- # set +x 00:06:32.302 ************************************ 00:06:32.302 START TEST thread_poller_perf 00:06:32.302 ************************************ 00:06:32.302 16:42:17 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:32.302 [2024-11-16 16:42:17.740338] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:32.302 [2024-11-16 16:42:17.740425] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid471496 ] 00:06:32.302 EAL: No free 2048 kB hugepages reported on node 1 00:06:32.302 [2024-11-16 16:42:17.808325] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:32.302 [2024-11-16 16:42:17.843102] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.302 Running 1000 pollers for 1 seconds with 0 microseconds period. 
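Note: a quick arithmetic check of run 1 above; poller_cost is busy cycles divided by iterations, converted to nanoseconds via tsc_hz:

  # 2505491406 cyc / 790000 polls ≈ 3171 cyc per poll
  # 3171 cyc at tsc_hz 2500000000 (2.5 cyc per ns) ≈ 1268 nsec
  echo '2505491406 / 790000' | bc
  echo '3171 * 10^9 / 2500000000' | bc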
00:06:33.238 [2024-11-16T15:42:18.988Z] ====================================== 00:06:33.238 [2024-11-16T15:42:18.988Z] busy:2501983800 (cyc) 00:06:33.238 [2024-11-16T15:42:18.988Z] total_run_count: 13112000 00:06:33.238 [2024-11-16T15:42:18.988Z] tsc_hz: 2500000000 (cyc) 00:06:33.238 [2024-11-16T15:42:18.988Z] ====================================== 00:06:33.238 [2024-11-16T15:42:18.988Z] poller_cost: 190 (cyc), 76 (nsec) 00:06:33.238 00:06:33.238 real 0m1.177s 00:06:33.238 user 0m1.078s 00:06:33.238 sys 0m0.094s 00:06:33.238 16:42:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:33.238 16:42:18 -- common/autotest_common.sh@10 -- # set +x 00:06:33.238 ************************************ 00:06:33.238 END TEST thread_poller_perf 00:06:33.238 ************************************ 00:06:33.238 16:42:18 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:33.238 16:42:18 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:33.238 16:42:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:33.238 16:42:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:33.238 16:42:18 -- common/autotest_common.sh@10 -- # set +x 00:06:33.238 ************************************ 00:06:33.238 START TEST thread_spdk_lock 00:06:33.238 ************************************ 00:06:33.238 16:42:18 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:33.238 [2024-11-16 16:42:18.959941] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:33.238 [2024-11-16 16:42:18.960063] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid471784 ] 00:06:33.498 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.498 [2024-11-16 16:42:19.030233] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:33.498 [2024-11-16 16:42:19.065954] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:33.498 [2024-11-16 16:42:19.065956] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.066 [2024-11-16 16:42:19.552200] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:34.066 [2024-11-16 16:42:19.552245] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:34.066 [2024-11-16 16:42:19.552272] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x12e2e40 00:06:34.066 [2024-11-16 16:42:19.553063] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:34.066 [2024-11-16 16:42:19.553167] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:34.066 [2024-11-16 16:42:19.553185] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:34.066 Starting test contend 00:06:34.066 Worker Delay Wait us Hold us Total us 00:06:34.066 0 3 164070 182559 346630 00:06:34.066 1 5 84184 284527 368711 00:06:34.066 PASS test contend 00:06:34.066 Starting test hold_by_poller 00:06:34.066 PASS test hold_by_poller 00:06:34.066 Starting test hold_by_message 00:06:34.066 PASS test hold_by_message 00:06:34.066 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:34.066 100014 assertions passed 00:06:34.066 0 assertions failed 00:06:34.066 00:06:34.066 real 0m0.663s 00:06:34.066 user 0m1.057s 00:06:34.066 sys 0m0.089s 00:06:34.066 16:42:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.066 16:42:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.066 ************************************ 00:06:34.066 END TEST thread_spdk_lock 00:06:34.066 ************************************ 00:06:34.066 00:06:34.066 real 0m3.312s 00:06:34.066 user 0m3.351s 00:06:34.066 sys 0m0.470s 00:06:34.066 16:42:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.066 16:42:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.066 ************************************ 00:06:34.066 END TEST thread 00:06:34.066 ************************************ 00:06:34.066 16:42:19 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:34.066 16:42:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:34.066 16:42:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.066 16:42:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.066 ************************************ 00:06:34.066 START TEST accel 00:06:34.066 ************************************ 00:06:34.066 16:42:19 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:34.066 * Looking for test storage... 00:06:34.066 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:34.066 16:42:19 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:34.066 16:42:19 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:34.066 16:42:19 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:34.326 16:42:19 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:34.326 16:42:19 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:34.326 16:42:19 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:34.326 16:42:19 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:34.326 16:42:19 -- scripts/common.sh@335 -- # IFS=.-: 00:06:34.326 16:42:19 -- scripts/common.sh@335 -- # read -ra ver1 00:06:34.326 16:42:19 -- scripts/common.sh@336 -- # IFS=.-: 00:06:34.326 16:42:19 -- scripts/common.sh@336 -- # read -ra ver2 00:06:34.326 16:42:19 -- scripts/common.sh@337 -- # local 'op=<' 00:06:34.326 16:42:19 -- scripts/common.sh@339 -- # ver1_l=2 00:06:34.326 16:42:19 -- scripts/common.sh@340 -- # ver2_l=1 00:06:34.326 16:42:19 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:34.326 16:42:19 -- scripts/common.sh@343 -- # case "$op" in 00:06:34.326 16:42:19 -- scripts/common.sh@344 -- # : 1 00:06:34.326 16:42:19 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:34.326 16:42:19 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:34.326 16:42:19 -- scripts/common.sh@364 -- # decimal 1 00:06:34.326 16:42:19 -- scripts/common.sh@352 -- # local d=1 00:06:34.326 16:42:19 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:34.326 16:42:19 -- scripts/common.sh@354 -- # echo 1 00:06:34.326 16:42:19 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:34.326 16:42:19 -- scripts/common.sh@365 -- # decimal 2 00:06:34.326 16:42:19 -- scripts/common.sh@352 -- # local d=2 00:06:34.326 16:42:19 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:34.326 16:42:19 -- scripts/common.sh@354 -- # echo 2 00:06:34.326 16:42:19 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:34.326 16:42:19 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:34.326 16:42:19 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:34.326 16:42:19 -- scripts/common.sh@367 -- # return 0 00:06:34.326 16:42:19 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:34.326 16:42:19 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:34.326 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.326 --rc genhtml_branch_coverage=1 00:06:34.326 --rc genhtml_function_coverage=1 00:06:34.326 --rc genhtml_legend=1 00:06:34.326 --rc geninfo_all_blocks=1 00:06:34.326 --rc geninfo_unexecuted_blocks=1 00:06:34.326 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.326 ' 00:06:34.326 16:42:19 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:34.326 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.326 --rc genhtml_branch_coverage=1 00:06:34.326 --rc genhtml_function_coverage=1 00:06:34.326 --rc genhtml_legend=1 00:06:34.326 --rc geninfo_all_blocks=1 00:06:34.326 --rc geninfo_unexecuted_blocks=1 00:06:34.326 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.326 ' 00:06:34.326 16:42:19 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:34.326 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.326 --rc genhtml_branch_coverage=1 00:06:34.326 --rc genhtml_function_coverage=1 00:06:34.326 --rc genhtml_legend=1 00:06:34.326 --rc geninfo_all_blocks=1 00:06:34.326 --rc geninfo_unexecuted_blocks=1 00:06:34.326 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.326 ' 00:06:34.326 16:42:19 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:34.326 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.326 --rc genhtml_branch_coverage=1 00:06:34.326 --rc genhtml_function_coverage=1 00:06:34.326 --rc genhtml_legend=1 00:06:34.326 --rc geninfo_all_blocks=1 00:06:34.326 --rc geninfo_unexecuted_blocks=1 00:06:34.326 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.326 ' 00:06:34.326 16:42:19 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:34.326 16:42:19 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:34.326 16:42:19 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:34.326 16:42:19 -- accel/accel.sh@59 -- # spdk_tgt_pid=471865 00:06:34.326 16:42:19 -- accel/accel.sh@60 -- # waitforlisten 471865 00:06:34.326 16:42:19 -- common/autotest_common.sh@829 -- # '[' -z 471865 ']' 00:06:34.326 16:42:19 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:34.326 16:42:19 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:34.326 16:42:19 
-- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:34.326 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:34.326 16:42:19 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:34.326 16:42:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.326 16:42:19 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:34.326 16:42:19 -- accel/accel.sh@58 -- # build_accel_config 00:06:34.326 16:42:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:34.326 16:42:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.326 16:42:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.326 16:42:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:34.326 16:42:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:34.326 16:42:19 -- accel/accel.sh@41 -- # local IFS=, 00:06:34.326 16:42:19 -- accel/accel.sh@42 -- # jq -r . 00:06:34.326 [2024-11-16 16:42:19.879442] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:34.326 [2024-11-16 16:42:19.879513] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid471865 ] 00:06:34.326 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.326 [2024-11-16 16:42:19.942640] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.326 [2024-11-16 16:42:19.979661] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:34.326 [2024-11-16 16:42:19.979786] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.264 16:42:20 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.264 16:42:20 -- common/autotest_common.sh@862 -- # return 0 00:06:35.264 16:42:20 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:35.264 16:42:20 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:35.264 16:42:20 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.264 16:42:20 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:35.264 16:42:20 -- common/autotest_common.sh@10 -- # set +x 00:06:35.264 16:42:20 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 
00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # IFS== 00:06:35.264 16:42:20 -- accel/accel.sh@64 -- # read -r opc module 00:06:35.264 16:42:20 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:35.264 16:42:20 -- accel/accel.sh@67 -- # killprocess 471865 00:06:35.264 16:42:20 -- common/autotest_common.sh@936 -- # '[' -z 471865 ']' 00:06:35.264 16:42:20 -- common/autotest_common.sh@940 -- # kill -0 471865 00:06:35.264 16:42:20 -- common/autotest_common.sh@941 -- # uname 00:06:35.264 16:42:20 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:35.264 16:42:20 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 471865 00:06:35.264 16:42:20 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:35.264 16:42:20 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:35.264 16:42:20 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 471865' 00:06:35.264 killing process with pid 471865 00:06:35.264 16:42:20 -- common/autotest_common.sh@955 -- # kill 471865 00:06:35.264 16:42:20 -- common/autotest_common.sh@960 -- # wait 471865 00:06:35.524 16:42:21 -- accel/accel.sh@68 -- # trap - ERR 00:06:35.524 16:42:21 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:35.524 16:42:21 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:35.524 16:42:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.524 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:35.524 16:42:21 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:06:35.524 16:42:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.524 16:42:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.524 16:42:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.524 16:42:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.524 16:42:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.524 16:42:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.524 16:42:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.524 16:42:21 -- accel/accel.sh@42 -- # jq -r . 
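Note: the long IFS== loop above is parsing accel_get_opc_assignments output; outside the harness the same listing can be produced directly (socket path assumed as in this run, jq filter verbatim from the trace):

  # Each key=value line maps an opcode (copy, fill, crc32c, ...) to the module
  # servicing it; with no hardware engines loaded, every opcode maps to software.
  ./scripts/rpc.py -s /var/tmp/spdk.sock accel_get_opc_assignments \
    | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]'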
00:06:35.524 16:42:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:35.524 16:42:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.524 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:35.524 16:42:21 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:35.524 16:42:21 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:35.524 16:42:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.524 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:35.524 ************************************ 00:06:35.524 START TEST accel_missing_filename 00:06:35.524 ************************************ 00:06:35.524 16:42:21 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:06:35.524 16:42:21 -- common/autotest_common.sh@650 -- # local es=0 00:06:35.524 16:42:21 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:35.524 16:42:21 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:35.524 16:42:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:35.524 16:42:21 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:35.524 16:42:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:35.524 16:42:21 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:06:35.524 16:42:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:35.524 16:42:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.524 16:42:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.524 16:42:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.524 16:42:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.524 16:42:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.525 16:42:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.525 16:42:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.525 16:42:21 -- accel/accel.sh@42 -- # jq -r . 00:06:35.525 [2024-11-16 16:42:21.201849] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:35.525 [2024-11-16 16:42:21.201936] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid472165 ] 00:06:35.525 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.525 [2024-11-16 16:42:21.270343] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.784 [2024-11-16 16:42:21.305607] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.784 [2024-11-16 16:42:21.345063] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:35.784 [2024-11-16 16:42:21.404797] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:35.784 A filename is required. 
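Note: the es= bookkeeping that follows is the harness's NOT helper deciding whether the failure was the expected kind; conceptually, a simplified sketch (not the exact autotest_common.sh implementation, which also filters signal-range exit codes):

  # NOT succeeds only when the wrapped command fails: accel_perf must reject
  # -w compress when no -l input file is supplied ("A filename is required.").
  NOT() { if "$@"; then return 1; else return 0; fi; }
  NOT ./build/examples/accel_perf -t 1 -w compress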
00:06:35.784 16:42:21 -- common/autotest_common.sh@653 -- # es=234 00:06:35.784 16:42:21 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:35.784 16:42:21 -- common/autotest_common.sh@662 -- # es=106 00:06:35.784 16:42:21 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:35.784 16:42:21 -- common/autotest_common.sh@670 -- # es=1 00:06:35.784 16:42:21 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:35.784 00:06:35.784 real 0m0.286s 00:06:35.784 user 0m0.192s 00:06:35.784 sys 0m0.135s 00:06:35.784 16:42:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.784 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:35.784 ************************************ 00:06:35.784 END TEST accel_missing_filename 00:06:35.784 ************************************ 00:06:35.785 16:42:21 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:35.785 16:42:21 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:35.785 16:42:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.785 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:35.785 ************************************ 00:06:35.785 START TEST accel_compress_verify 00:06:35.785 ************************************ 00:06:35.785 16:42:21 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:35.785 16:42:21 -- common/autotest_common.sh@650 -- # local es=0 00:06:35.785 16:42:21 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:35.785 16:42:21 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:35.785 16:42:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:35.785 16:42:21 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:35.785 16:42:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:35.785 16:42:21 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:35.785 16:42:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:35.785 16:42:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.785 16:42:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.785 16:42:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.785 16:42:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.785 16:42:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.785 16:42:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.785 16:42:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.785 16:42:21 -- accel/accel.sh@42 -- # jq -r . 00:06:35.785 [2024-11-16 16:42:21.524595] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:35.785 [2024-11-16 16:42:21.524701] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid472220 ] 00:06:36.045 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.045 [2024-11-16 16:42:21.594811] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.045 [2024-11-16 16:42:21.630365] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.045 [2024-11-16 16:42:21.669594] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:36.045 [2024-11-16 16:42:21.729867] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:36.045 00:06:36.045 Compression does not support the verify option, aborting. 00:06:36.045 16:42:21 -- common/autotest_common.sh@653 -- # es=161 00:06:36.045 16:42:21 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:36.045 16:42:21 -- common/autotest_common.sh@662 -- # es=33 00:06:36.045 16:42:21 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:36.045 16:42:21 -- common/autotest_common.sh@670 -- # es=1 00:06:36.045 16:42:21 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:36.045 00:06:36.045 real 0m0.288s 00:06:36.045 user 0m0.202s 00:06:36.045 sys 0m0.124s 00:06:36.045 16:42:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.045 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:36.045 ************************************ 00:06:36.045 END TEST accel_compress_verify 00:06:36.045 ************************************ 00:06:36.305 16:42:21 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:36.305 16:42:21 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:36.305 16:42:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.305 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:36.305 ************************************ 00:06:36.305 START TEST accel_wrong_workload 00:06:36.305 ************************************ 00:06:36.305 16:42:21 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:06:36.305 16:42:21 -- common/autotest_common.sh@650 -- # local es=0 00:06:36.305 16:42:21 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:36.305 16:42:21 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:36.305 16:42:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.305 16:42:21 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:36.305 16:42:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.305 16:42:21 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:06:36.305 16:42:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:36.305 16:42:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.305 16:42:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.305 16:42:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.305 16:42:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.305 16:42:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.305 16:42:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.305 16:42:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.305 16:42:21 -- accel/accel.sh@42 -- # jq -r . 
00:06:36.305 Unsupported workload type: foobar 00:06:36.305 [2024-11-16 16:42:21.850657] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:36.305 accel_perf options: 00:06:36.305 [-h help message] 00:06:36.305 [-q queue depth per core] 00:06:36.305 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:36.305 [-T number of threads per core 00:06:36.305 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:36.305 [-t time in seconds] 00:06:36.305 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:36.305 [ dif_verify, , dif_generate, dif_generate_copy 00:06:36.305 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:36.305 [-l for compress/decompress workloads, name of uncompressed input file 00:06:36.305 [-S for crc32c workload, use this seed value (default 0) 00:06:36.305 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:36.305 [-f for fill workload, use this BYTE value (default 255) 00:06:36.305 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:36.305 [-y verify result if this switch is on] 00:06:36.305 [-a tasks to allocate per core (default: same value as -q)] 00:06:36.305 Can be used to spread operations across a wider range of memory. 00:06:36.305 16:42:21 -- common/autotest_common.sh@653 -- # es=1 00:06:36.305 16:42:21 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:36.305 16:42:21 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:36.305 16:42:21 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:36.305 00:06:36.305 real 0m0.027s 00:06:36.305 user 0m0.013s 00:06:36.305 sys 0m0.014s 00:06:36.305 16:42:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.305 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:36.305 ************************************ 00:06:36.305 END TEST accel_wrong_workload 00:06:36.305 ************************************ 00:06:36.305 Error: writing output failed: Broken pipe 00:06:36.305 16:42:21 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:36.305 16:42:21 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:36.305 16:42:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.305 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:36.305 ************************************ 00:06:36.305 START TEST accel_negative_buffers 00:06:36.305 ************************************ 00:06:36.305 16:42:21 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:36.305 16:42:21 -- common/autotest_common.sh@650 -- # local es=0 00:06:36.305 16:42:21 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:36.305 16:42:21 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:36.305 16:42:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.305 16:42:21 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:36.305 16:42:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:36.305 16:42:21 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:06:36.305 16:42:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.305 16:42:21 -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:36.305 16:42:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.305 16:42:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.305 16:42:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.305 16:42:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.305 16:42:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.305 16:42:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.305 16:42:21 -- accel/accel.sh@42 -- # jq -r . 00:06:36.305 -x option must be non-negative. 00:06:36.305 [2024-11-16 16:42:21.911702] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:36.305 accel_perf options: 00:06:36.305 [-h help message] 00:06:36.305 [-q queue depth per core] 00:06:36.305 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:36.305 [-T number of threads per core 00:06:36.305 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:36.305 [-t time in seconds] 00:06:36.305 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:36.305 [ dif_verify, , dif_generate, dif_generate_copy 00:06:36.305 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:36.305 [-l for compress/decompress workloads, name of uncompressed input file 00:06:36.305 [-S for crc32c workload, use this seed value (default 0) 00:06:36.305 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:36.305 [-f for fill workload, use this BYTE value (default 255) 00:06:36.305 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:36.305 [-y verify result if this switch is on] 00:06:36.305 [-a tasks to allocate per core (default: same value as -q)] 00:06:36.305 Can be used to spread operations across a wider range of memory. 
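That usage dump, printed once for each rejected invocation above, doubles as a reference for what a valid run looks like. The flag meanings below come straight from the help text; the path is relative to the spdk checkout used in this workspace, and the harness additionally passes a JSON config via -c /dev/fd/62, omitted here for brevity:

    # rejected: foobar is not in the supported workload list
    ./build/examples/accel_perf -t 1 -w foobar
    # rejected: -x (xor source buffer count) must be non-negative, minimum 2
    ./build/examples/accel_perf -t 1 -w xor -y -x -1
    # accepted: xor over 2 source buffers for 1 second, verifying results (-y)
    ./build/examples/accel_perf -t 1 -w xor -y -x 2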
00:06:36.305 16:42:21 -- common/autotest_common.sh@653 -- # es=1 00:06:36.305 16:42:21 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:36.305 16:42:21 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:36.305 16:42:21 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:36.305 00:06:36.305 real 0m0.015s 00:06:36.305 user 0m0.005s 00:06:36.305 sys 0m0.010s 00:06:36.305 16:42:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.305 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:36.305 ************************************ 00:06:36.305 END TEST accel_negative_buffers 00:06:36.305 ************************************ 00:06:36.305 Error: writing output failed: Broken pipe 00:06:36.305 16:42:21 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:36.305 16:42:21 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:36.305 16:42:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.305 16:42:21 -- common/autotest_common.sh@10 -- # set +x 00:06:36.305 ************************************ 00:06:36.305 START TEST accel_crc32c 00:06:36.305 ************************************ 00:06:36.305 16:42:21 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:36.305 16:42:21 -- accel/accel.sh@16 -- # local accel_opc 00:06:36.305 16:42:21 -- accel/accel.sh@17 -- # local accel_module 00:06:36.305 16:42:21 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:36.305 16:42:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:36.305 16:42:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:36.305 16:42:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:36.305 16:42:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:36.305 16:42:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:36.305 16:42:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:36.305 16:42:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:36.305 16:42:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:36.306 16:42:21 -- accel/accel.sh@42 -- # jq -r . 00:06:36.306 [2024-11-16 16:42:21.982327] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:36.306 [2024-11-16 16:42:21.982416] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid472496 ] 00:06:36.306 EAL: No free 2048 kB hugepages reported on node 1 00:06:36.306 [2024-11-16 16:42:22.049757] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.565 [2024-11-16 16:42:22.086577] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.501 16:42:23 -- accel/accel.sh@18 -- # out=' 00:06:37.501 SPDK Configuration: 00:06:37.501 Core mask: 0x1 00:06:37.501 00:06:37.502 Accel Perf Configuration: 00:06:37.502 Workload Type: crc32c 00:06:37.502 CRC-32C seed: 32 00:06:37.502 Transfer size: 4096 bytes 00:06:37.502 Vector count 1 00:06:37.502 Module: software 00:06:37.502 Queue depth: 32 00:06:37.502 Allocate depth: 32 00:06:37.502 # threads/core: 1 00:06:37.502 Run time: 1 seconds 00:06:37.502 Verify: Yes 00:06:37.502 00:06:37.502 Running for 1 seconds... 
00:06:37.502 00:06:37.502 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:37.502 ------------------------------------------------------------------------------------ 00:06:37.502 0,0 848608/s 3314 MiB/s 0 0 00:06:37.502 ==================================================================================== 00:06:37.502 Total 848608/s 3314 MiB/s 0 0' 00:06:37.761 16:42:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.761 16:42:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.761 16:42:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.761 16:42:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.761 16:42:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.761 16:42:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.761 16:42:23 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.761 16:42:23 -- accel/accel.sh@42 -- # jq -r . 00:06:37.761 [2024-11-16 16:42:23.258214] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:37.761 [2024-11-16 16:42:23.258263] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid472656 ] 00:06:37.761 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.761 [2024-11-16 16:42:23.315209] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.761 [2024-11-16 16:42:23.349695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val= 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val= 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val=0x1 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val= 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val= 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val=crc32c 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val=32 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 
16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val= 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val=software 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@23 -- # accel_module=software 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val=32 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val=32 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val=1 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val=Yes 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val= 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:37.761 16:42:23 -- accel/accel.sh@21 -- # val= 00:06:37.761 16:42:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # IFS=: 00:06:37.761 16:42:23 -- accel/accel.sh@20 -- # read -r var val 00:06:39.140 16:42:24 -- accel/accel.sh@21 -- # val= 00:06:39.140 16:42:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # IFS=: 00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # read -r var val 00:06:39.140 16:42:24 -- accel/accel.sh@21 -- # val= 00:06:39.140 16:42:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # IFS=: 00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # read -r var val 00:06:39.140 16:42:24 -- accel/accel.sh@21 -- # val= 00:06:39.140 16:42:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # IFS=: 00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # read -r var val 00:06:39.140 16:42:24 -- accel/accel.sh@21 -- # val= 00:06:39.140 16:42:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # IFS=: 00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # read -r var val 00:06:39.140 16:42:24 -- accel/accel.sh@21 -- # val= 00:06:39.140 16:42:24 -- accel/accel.sh@22 -- # case "$var" in 
00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # IFS=: 00:06:39.140 16:42:24 -- accel/accel.sh@20 -- # read -r var val 00:06:39.141 16:42:24 -- accel/accel.sh@21 -- # val= 00:06:39.141 16:42:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:39.141 16:42:24 -- accel/accel.sh@20 -- # IFS=: 00:06:39.141 16:42:24 -- accel/accel.sh@20 -- # read -r var val 00:06:39.141 16:42:24 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:39.141 16:42:24 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:39.141 16:42:24 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:39.141 00:06:39.141 real 0m2.556s 00:06:39.141 user 0m2.322s 00:06:39.141 sys 0m0.243s 00:06:39.141 16:42:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:39.141 16:42:24 -- common/autotest_common.sh@10 -- # set +x 00:06:39.141 ************************************ 00:06:39.141 END TEST accel_crc32c 00:06:39.141 ************************************ 00:06:39.141 16:42:24 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:39.141 16:42:24 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:39.141 16:42:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:39.141 16:42:24 -- common/autotest_common.sh@10 -- # set +x 00:06:39.141 ************************************ 00:06:39.141 START TEST accel_crc32c_C2 00:06:39.141 ************************************ 00:06:39.141 16:42:24 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:39.141 16:42:24 -- accel/accel.sh@16 -- # local accel_opc 00:06:39.141 16:42:24 -- accel/accel.sh@17 -- # local accel_module 00:06:39.141 16:42:24 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:39.141 16:42:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:39.141 16:42:24 -- accel/accel.sh@12 -- # build_accel_config 00:06:39.141 16:42:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:39.141 16:42:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:39.141 16:42:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:39.141 16:42:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:39.141 16:42:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:39.141 16:42:24 -- accel/accel.sh@41 -- # local IFS=, 00:06:39.141 16:42:24 -- accel/accel.sh@42 -- # jq -r . 00:06:39.141 [2024-11-16 16:42:24.577736] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:39.141 [2024-11-16 16:42:24.577817] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid472825 ] 00:06:39.141 EAL: No free 2048 kB hugepages reported on node 1 00:06:39.141 [2024-11-16 16:42:24.645684] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.141 [2024-11-16 16:42:24.680992] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.519 16:42:25 -- accel/accel.sh@18 -- # out=' 00:06:40.519 SPDK Configuration: 00:06:40.519 Core mask: 0x1 00:06:40.519 00:06:40.519 Accel Perf Configuration: 00:06:40.519 Workload Type: crc32c 00:06:40.519 CRC-32C seed: 0 00:06:40.519 Transfer size: 4096 bytes 00:06:40.519 Vector count 2 00:06:40.519 Module: software 00:06:40.519 Queue depth: 32 00:06:40.519 Allocate depth: 32 00:06:40.519 # threads/core: 1 00:06:40.519 Run time: 1 seconds 00:06:40.519 Verify: Yes 00:06:40.519 00:06:40.519 Running for 1 seconds... 00:06:40.519 00:06:40.519 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:40.519 ------------------------------------------------------------------------------------ 00:06:40.519 0,0 613760/s 4795 MiB/s 0 0 00:06:40.519 ==================================================================================== 00:06:40.519 Total 613760/s 2397 MiB/s 0 0' 00:06:40.519 16:42:25 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.519 16:42:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.519 16:42:25 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.519 16:42:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.519 16:42:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.519 16:42:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.519 16:42:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.519 16:42:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.519 16:42:25 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.519 16:42:25 -- accel/accel.sh@42 -- # jq -r . 00:06:40.519 [2024-11-16 16:42:25.851937] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:40.519 [2024-11-16 16:42:25.851986] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid473064 ] 00:06:40.519 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.519 [2024-11-16 16:42:25.909367] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.519 [2024-11-16 16:42:25.943488] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.519 16:42:25 -- accel/accel.sh@21 -- # val= 00:06:40.519 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.519 16:42:25 -- accel/accel.sh@21 -- # val= 00:06:40.519 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.519 16:42:25 -- accel/accel.sh@21 -- # val=0x1 00:06:40.519 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.519 16:42:25 -- accel/accel.sh@21 -- # val= 00:06:40.519 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.519 16:42:25 -- accel/accel.sh@21 -- # val= 00:06:40.519 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.519 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.519 16:42:25 -- accel/accel.sh@21 -- # val=crc32c 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val=0 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val= 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val=software 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@23 -- # accel_module=software 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val=32 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val=32 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- 
accel/accel.sh@21 -- # val=1 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val=Yes 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val= 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:40.520 16:42:25 -- accel/accel.sh@21 -- # val= 00:06:40.520 16:42:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # IFS=: 00:06:40.520 16:42:25 -- accel/accel.sh@20 -- # read -r var val 00:06:41.459 16:42:27 -- accel/accel.sh@21 -- # val= 00:06:41.459 16:42:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # IFS=: 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # read -r var val 00:06:41.459 16:42:27 -- accel/accel.sh@21 -- # val= 00:06:41.459 16:42:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # IFS=: 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # read -r var val 00:06:41.459 16:42:27 -- accel/accel.sh@21 -- # val= 00:06:41.459 16:42:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # IFS=: 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # read -r var val 00:06:41.459 16:42:27 -- accel/accel.sh@21 -- # val= 00:06:41.459 16:42:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # IFS=: 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # read -r var val 00:06:41.459 16:42:27 -- accel/accel.sh@21 -- # val= 00:06:41.459 16:42:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # IFS=: 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # read -r var val 00:06:41.459 16:42:27 -- accel/accel.sh@21 -- # val= 00:06:41.459 16:42:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # IFS=: 00:06:41.459 16:42:27 -- accel/accel.sh@20 -- # read -r var val 00:06:41.459 16:42:27 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:41.459 16:42:27 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:41.459 16:42:27 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:41.459 00:06:41.459 real 0m2.556s 00:06:41.459 user 0m2.324s 00:06:41.459 sys 0m0.241s 00:06:41.459 16:42:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:41.459 16:42:27 -- common/autotest_common.sh@10 -- # set +x 00:06:41.459 ************************************ 00:06:41.459 END TEST accel_crc32c_C2 00:06:41.459 ************************************ 00:06:41.459 16:42:27 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:41.459 16:42:27 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:41.459 16:42:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:41.459 16:42:27 -- common/autotest_common.sh@10 -- # set +x 00:06:41.459 ************************************ 00:06:41.459 START TEST accel_copy 
00:06:41.459 ************************************ 00:06:41.459 16:42:27 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:06:41.459 16:42:27 -- accel/accel.sh@16 -- # local accel_opc 00:06:41.459 16:42:27 -- accel/accel.sh@17 -- # local accel_module 00:06:41.459 16:42:27 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:41.459 16:42:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:41.459 16:42:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.459 16:42:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.459 16:42:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.459 16:42:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.459 16:42:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.459 16:42:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.459 16:42:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.459 16:42:27 -- accel/accel.sh@42 -- # jq -r . 00:06:41.459 [2024-11-16 16:42:27.173001] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:41.459 [2024-11-16 16:42:27.173092] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid473353 ] 00:06:41.459 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.719 [2024-11-16 16:42:27.241489] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.719 [2024-11-16 16:42:27.276825] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.099 16:42:28 -- accel/accel.sh@18 -- # out=' 00:06:43.099 SPDK Configuration: 00:06:43.099 Core mask: 0x1 00:06:43.099 00:06:43.099 Accel Perf Configuration: 00:06:43.099 Workload Type: copy 00:06:43.099 Transfer size: 4096 bytes 00:06:43.099 Vector count 1 00:06:43.099 Module: software 00:06:43.099 Queue depth: 32 00:06:43.099 Allocate depth: 32 00:06:43.099 # threads/core: 1 00:06:43.099 Run time: 1 seconds 00:06:43.099 Verify: Yes 00:06:43.099 00:06:43.099 Running for 1 seconds... 00:06:43.099 00:06:43.099 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:43.099 ------------------------------------------------------------------------------------ 00:06:43.099 0,0 549888/s 2148 MiB/s 0 0 00:06:43.099 ==================================================================================== 00:06:43.099 Total 549888/s 2148 MiB/s 0 0' 00:06:43.099 16:42:28 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:43.099 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.099 16:42:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:43.099 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.099 16:42:28 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.099 16:42:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.099 16:42:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.099 16:42:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.099 16:42:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.099 16:42:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.099 16:42:28 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.099 16:42:28 -- accel/accel.sh@42 -- # jq -r . 00:06:43.099 [2024-11-16 16:42:28.447529] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:43.099 [2024-11-16 16:42:28.447577] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid473619 ] 00:06:43.099 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.099 [2024-11-16 16:42:28.504329] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.099 [2024-11-16 16:42:28.539100] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.099 16:42:28 -- accel/accel.sh@21 -- # val= 00:06:43.099 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.099 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val= 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val=0x1 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val= 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val= 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val=copy 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val= 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val=software 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@23 -- # accel_module=software 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val=32 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val=32 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val=1 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val=Yes 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val= 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:43.100 16:42:28 -- accel/accel.sh@21 -- # val= 00:06:43.100 16:42:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # IFS=: 00:06:43.100 16:42:28 -- accel/accel.sh@20 -- # read -r var val 00:06:44.035 16:42:29 -- accel/accel.sh@21 -- # val= 00:06:44.035 16:42:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # IFS=: 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # read -r var val 00:06:44.035 16:42:29 -- accel/accel.sh@21 -- # val= 00:06:44.035 16:42:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # IFS=: 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # read -r var val 00:06:44.035 16:42:29 -- accel/accel.sh@21 -- # val= 00:06:44.035 16:42:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # IFS=: 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # read -r var val 00:06:44.035 16:42:29 -- accel/accel.sh@21 -- # val= 00:06:44.035 16:42:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # IFS=: 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # read -r var val 00:06:44.035 16:42:29 -- accel/accel.sh@21 -- # val= 00:06:44.035 16:42:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # IFS=: 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # read -r var val 00:06:44.035 16:42:29 -- accel/accel.sh@21 -- # val= 00:06:44.035 16:42:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # IFS=: 00:06:44.035 16:42:29 -- accel/accel.sh@20 -- # read -r var val 00:06:44.035 16:42:29 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:44.035 16:42:29 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:44.035 16:42:29 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:44.035 00:06:44.035 real 0m2.555s 00:06:44.035 user 0m2.316s 00:06:44.035 sys 0m0.247s 00:06:44.035 16:42:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:44.035 16:42:29 -- common/autotest_common.sh@10 -- # set +x 00:06:44.035 ************************************ 00:06:44.035 END TEST accel_copy 00:06:44.035 ************************************ 00:06:44.035 16:42:29 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:44.035 16:42:29 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:44.035 16:42:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:44.035 16:42:29 -- common/autotest_common.sh@10 -- # set +x 00:06:44.035 ************************************ 00:06:44.035 START TEST accel_fill 00:06:44.035 ************************************ 00:06:44.035 16:42:29 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:44.035 16:42:29 -- accel/accel.sh@16 -- # local accel_opc 
00:06:44.035 16:42:29 -- accel/accel.sh@17 -- # local accel_module 00:06:44.035 16:42:29 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:44.035 16:42:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:44.035 16:42:29 -- accel/accel.sh@12 -- # build_accel_config 00:06:44.035 16:42:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:44.035 16:42:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.035 16:42:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.035 16:42:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:44.035 16:42:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:44.035 16:42:29 -- accel/accel.sh@41 -- # local IFS=, 00:06:44.035 16:42:29 -- accel/accel.sh@42 -- # jq -r . 00:06:44.035 [2024-11-16 16:42:29.760444] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:44.035 [2024-11-16 16:42:29.760492] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid473904 ] 00:06:44.294 EAL: No free 2048 kB hugepages reported on node 1 00:06:44.294 [2024-11-16 16:42:29.823436] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.294 [2024-11-16 16:42:29.858677] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.673 16:42:31 -- accel/accel.sh@18 -- # out=' 00:06:45.673 SPDK Configuration: 00:06:45.673 Core mask: 0x1 00:06:45.673 00:06:45.673 Accel Perf Configuration: 00:06:45.673 Workload Type: fill 00:06:45.673 Fill pattern: 0x80 00:06:45.673 Transfer size: 4096 bytes 00:06:45.673 Vector count 1 00:06:45.673 Module: software 00:06:45.673 Queue depth: 64 00:06:45.673 Allocate depth: 64 00:06:45.673 # threads/core: 1 00:06:45.673 Run time: 1 seconds 00:06:45.673 Verify: Yes 00:06:45.673 00:06:45.673 Running for 1 seconds... 00:06:45.673 00:06:45.673 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:45.673 ------------------------------------------------------------------------------------ 00:06:45.673 0,0 929024/s 3629 MiB/s 0 0 00:06:45.673 ==================================================================================== 00:06:45.673 Total 929024/s 3629 MiB/s 0 0' 00:06:45.673 16:42:31 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:45.673 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.673 16:42:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:45.673 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.673 16:42:31 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.673 16:42:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.673 16:42:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.673 16:42:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.673 16:42:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.673 16:42:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.673 16:42:31 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.673 16:42:31 -- accel/accel.sh@42 -- # jq -r . 00:06:45.673 [2024-11-16 16:42:31.031018] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:45.673 [2024-11-16 16:42:31.031067] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid474136 ] 00:06:45.673 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.673 [2024-11-16 16:42:31.087633] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.673 [2024-11-16 16:42:31.122025] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.673 16:42:31 -- accel/accel.sh@21 -- # val= 00:06:45.673 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.673 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.673 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val= 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val=0x1 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val= 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val= 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val=fill 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val=0x80 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val= 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val=software 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@23 -- # accel_module=software 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val=64 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val=64 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- 
accel/accel.sh@21 -- # val=1 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val=Yes 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val= 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:45.674 16:42:31 -- accel/accel.sh@21 -- # val= 00:06:45.674 16:42:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # IFS=: 00:06:45.674 16:42:31 -- accel/accel.sh@20 -- # read -r var val 00:06:46.612 16:42:32 -- accel/accel.sh@21 -- # val= 00:06:46.612 16:42:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # IFS=: 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # read -r var val 00:06:46.612 16:42:32 -- accel/accel.sh@21 -- # val= 00:06:46.612 16:42:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # IFS=: 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # read -r var val 00:06:46.612 16:42:32 -- accel/accel.sh@21 -- # val= 00:06:46.612 16:42:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # IFS=: 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # read -r var val 00:06:46.612 16:42:32 -- accel/accel.sh@21 -- # val= 00:06:46.612 16:42:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # IFS=: 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # read -r var val 00:06:46.612 16:42:32 -- accel/accel.sh@21 -- # val= 00:06:46.612 16:42:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # IFS=: 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # read -r var val 00:06:46.612 16:42:32 -- accel/accel.sh@21 -- # val= 00:06:46.612 16:42:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # IFS=: 00:06:46.612 16:42:32 -- accel/accel.sh@20 -- # read -r var val 00:06:46.612 16:42:32 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:46.612 16:42:32 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:46.612 16:42:32 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.612 00:06:46.612 real 0m2.540s 00:06:46.612 user 0m2.315s 00:06:46.612 sys 0m0.234s 00:06:46.612 16:42:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:46.612 16:42:32 -- common/autotest_common.sh@10 -- # set +x 00:06:46.612 ************************************ 00:06:46.612 END TEST accel_fill 00:06:46.612 ************************************ 00:06:46.612 16:42:32 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:46.612 16:42:32 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:46.612 16:42:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:46.612 16:42:32 -- common/autotest_common.sh@10 -- # set +x 00:06:46.612 ************************************ 00:06:46.612 START TEST 
accel_copy_crc32c 00:06:46.612 ************************************ 00:06:46.612 16:42:32 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:06:46.612 16:42:32 -- accel/accel.sh@16 -- # local accel_opc 00:06:46.612 16:42:32 -- accel/accel.sh@17 -- # local accel_module 00:06:46.612 16:42:32 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:46.612 16:42:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:46.612 16:42:32 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.612 16:42:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.612 16:42:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.612 16:42:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.612 16:42:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.612 16:42:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.612 16:42:32 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.612 16:42:32 -- accel/accel.sh@42 -- # jq -r . 00:06:46.612 [2024-11-16 16:42:32.351766] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:46.612 [2024-11-16 16:42:32.351854] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid474317 ] 00:06:46.871 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.871 [2024-11-16 16:42:32.421153] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.871 [2024-11-16 16:42:32.456201] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.250 16:42:33 -- accel/accel.sh@18 -- # out=' 00:06:48.250 SPDK Configuration: 00:06:48.250 Core mask: 0x1 00:06:48.250 00:06:48.250 Accel Perf Configuration: 00:06:48.250 Workload Type: copy_crc32c 00:06:48.250 CRC-32C seed: 0 00:06:48.250 Vector size: 4096 bytes 00:06:48.250 Transfer size: 4096 bytes 00:06:48.250 Vector count 1 00:06:48.250 Module: software 00:06:48.250 Queue depth: 32 00:06:48.250 Allocate depth: 32 00:06:48.250 # threads/core: 1 00:06:48.250 Run time: 1 seconds 00:06:48.250 Verify: Yes 00:06:48.250 00:06:48.250 Running for 1 seconds... 00:06:48.250 00:06:48.250 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:48.250 ------------------------------------------------------------------------------------ 00:06:48.250 0,0 437248/s 1708 MiB/s 0 0 00:06:48.250 ==================================================================================== 00:06:48.250 Total 437248/s 1708 MiB/s 0 0' 00:06:48.250 16:42:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:48.250 16:42:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.250 16:42:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.250 16:42:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.250 16:42:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.250 16:42:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.250 16:42:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.250 16:42:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.250 16:42:33 -- accel/accel.sh@42 -- # jq -r . 
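Every accel_perf invocation in this suite re-initializes the DPDK EAL with the parameters echoed above, and each one logs the same 'EAL: No free 2048 kB hugepages reported on node 1' notice: NUMA node 1 of this host simply has no free 2 MB hugepages, and since the runs proceed on core 0 of node 0 the message is informational for these tests. On a Linux host the per-node counters can be inspected directly (standard sysfs paths, not commands this job runs):

    # free 2048 kB hugepages per NUMA node
    cat /sys/devices/system/node/node*/hugepages/hugepages-2048kB/free_hugepages
    # overall hugepage accounting
    grep -i huge /proc/meminfo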
00:06:48.250 [2024-11-16 16:42:33.637711] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:48.250 [2024-11-16 16:42:33.637799] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid474481 ] 00:06:48.250 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.250 [2024-11-16 16:42:33.707910] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.250 [2024-11-16 16:42:33.742279] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val= 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val= 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val=0x1 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val= 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val= 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val=0 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val= 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val=software 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@23 -- # accel_module=software 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val=32 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 
00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val=32 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val=1 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val=Yes 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val= 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:48.250 16:42:33 -- accel/accel.sh@21 -- # val= 00:06:48.250 16:42:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # IFS=: 00:06:48.250 16:42:33 -- accel/accel.sh@20 -- # read -r var val 00:06:49.187 16:42:34 -- accel/accel.sh@21 -- # val= 00:06:49.187 16:42:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # IFS=: 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # read -r var val 00:06:49.187 16:42:34 -- accel/accel.sh@21 -- # val= 00:06:49.187 16:42:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # IFS=: 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # read -r var val 00:06:49.187 16:42:34 -- accel/accel.sh@21 -- # val= 00:06:49.187 16:42:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # IFS=: 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # read -r var val 00:06:49.187 16:42:34 -- accel/accel.sh@21 -- # val= 00:06:49.187 16:42:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # IFS=: 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # read -r var val 00:06:49.187 16:42:34 -- accel/accel.sh@21 -- # val= 00:06:49.187 16:42:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # IFS=: 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # read -r var val 00:06:49.187 16:42:34 -- accel/accel.sh@21 -- # val= 00:06:49.187 16:42:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # IFS=: 00:06:49.187 16:42:34 -- accel/accel.sh@20 -- # read -r var val 00:06:49.187 16:42:34 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:49.187 16:42:34 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:49.187 16:42:34 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.187 00:06:49.187 real 0m2.580s 00:06:49.187 user 0m2.322s 00:06:49.187 sys 0m0.267s 00:06:49.187 16:42:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:49.187 16:42:34 -- common/autotest_common.sh@10 -- # set +x 00:06:49.187 ************************************ 00:06:49.187 END TEST accel_copy_crc32c 00:06:49.187 ************************************ 00:06:49.447 
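Across all of these result tables the Bandwidth column is just the Transfers column times the bytes moved per transfer, converted to MiB/s. For the copy_crc32c test that just finished, 437248 transfers/s at the 4096-byte transfer size works out to the reported 1708 MiB/s; the crc32c (848608/s -> 3314 MiB/s) and fill (929024/s -> 3629 MiB/s) runs follow the same formula. The arithmetic can be checked with a one-liner (plain math, not part of the suite):

    awk 'BEGIN { printf "%.0f MiB/s\n", 437248 * 4096 / 1048576 }'   # prints 1708 MiB/s

Note that the two -C 2 runs report their per-core row against vector_count * transfer_size (613760/s -> 4795 MiB/s and 308064/s -> 2406 MiB/s) while their Total rows use the plain 4096-byte size (2397 and 1203 MiB/s); the mismatch comes from accel_perf's own reporting.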
16:42:34 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:49.447 16:42:34 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:49.447 16:42:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:49.447 16:42:34 -- common/autotest_common.sh@10 -- # set +x 00:06:49.447 ************************************ 00:06:49.447 START TEST accel_copy_crc32c_C2 00:06:49.447 ************************************ 00:06:49.447 16:42:34 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:49.447 16:42:34 -- accel/accel.sh@16 -- # local accel_opc 00:06:49.447 16:42:34 -- accel/accel.sh@17 -- # local accel_module 00:06:49.447 16:42:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:49.447 16:42:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.447 16:42:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:49.447 16:42:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.447 16:42:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.447 16:42:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.447 16:42:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.447 16:42:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.447 16:42:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.447 16:42:34 -- accel/accel.sh@42 -- # jq -r . 00:06:49.447 [2024-11-16 16:42:34.973892] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:49.447 [2024-11-16 16:42:34.973978] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid474766 ] 00:06:49.447 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.447 [2024-11-16 16:42:35.039524] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.447 [2024-11-16 16:42:35.073848] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.824 16:42:36 -- accel/accel.sh@18 -- # out=' 00:06:50.824 SPDK Configuration: 00:06:50.824 Core mask: 0x1 00:06:50.824 00:06:50.824 Accel Perf Configuration: 00:06:50.824 Workload Type: copy_crc32c 00:06:50.824 CRC-32C seed: 0 00:06:50.824 Vector size: 4096 bytes 00:06:50.824 Transfer size: 8192 bytes 00:06:50.824 Vector count 2 00:06:50.824 Module: software 00:06:50.824 Queue depth: 32 00:06:50.824 Allocate depth: 32 00:06:50.824 # threads/core: 1 00:06:50.824 Run time: 1 seconds 00:06:50.824 Verify: Yes 00:06:50.824 00:06:50.824 Running for 1 seconds... 
00:06:50.824
00:06:50.824 Core,Thread Transfers Bandwidth Failed Miscompares
00:06:50.824 ------------------------------------------------------------------------------------
00:06:50.824 0,0 308064/s 2406 MiB/s 0 0
00:06:50.824 ====================================================================================
00:06:50.824 Total 308064/s 1203 MiB/s 0 0'
00:06:50.824 16:42:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2
00:06:50.824 16:42:36 -- accel/accel.sh@20 -- # IFS=:
00:06:50.824 16:42:36 -- accel/accel.sh@20 -- # read -r var val
00:06:50.824 16:42:36 -- accel/accel.sh@12 -- # build_accel_config
00:06:50.824 16:42:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2
00:06:50.824 16:42:36 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:50.824 16:42:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:50.824 16:42:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:50.824 16:42:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:50.824 16:42:36 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:50.824 16:42:36 -- accel/accel.sh@41 -- # local IFS=,
00:06:50.824 16:42:36 -- accel/accel.sh@42 -- # jq -r .
00:06:50.824 [2024-11-16 16:42:36.243941] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:50.825 [2024-11-16 16:42:36.243990] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475034 ]
00:06:50.825 EAL: No free 2048 kB hugepages reported on node 1
00:06:50.825 [2024-11-16 16:42:36.303948] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:50.825 [2024-11-16 16:42:36.337473] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=
00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=:
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val
00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=
00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=:
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val
00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=0x1
00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=:
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val
00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=
00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=:
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val
00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=
00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=:
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val
00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=copy_crc32c
00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in
00:06:50.825 16:42:36 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=:
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val
00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=0
00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=:
00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val= 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=software 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@23 -- # accel_module=software 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=32 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=32 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=1 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val=Yes 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val= 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:50.825 16:42:36 -- accel/accel.sh@21 -- # val= 00:06:50.825 16:42:36 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # IFS=: 00:06:50.825 16:42:36 -- accel/accel.sh@20 -- # read -r var val 00:06:51.763 16:42:37 -- accel/accel.sh@21 -- # val= 00:06:51.763 16:42:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # IFS=: 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # read -r var val 00:06:51.763 16:42:37 -- accel/accel.sh@21 -- # val= 00:06:51.763 16:42:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # IFS=: 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # read -r var val 00:06:51.763 16:42:37 -- accel/accel.sh@21 -- # val= 00:06:51.763 16:42:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # IFS=: 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # read -r var val 00:06:51.763 16:42:37 -- accel/accel.sh@21 -- # val= 00:06:51.763 16:42:37 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # IFS=: 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # read -r var val 00:06:51.763 16:42:37 -- accel/accel.sh@21 -- # val= 00:06:51.763 16:42:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # IFS=: 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # read -r var val 00:06:51.763 16:42:37 -- accel/accel.sh@21 -- # val= 00:06:51.763 16:42:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # IFS=: 00:06:51.763 16:42:37 -- accel/accel.sh@20 -- # read -r var val 00:06:51.763 16:42:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:51.763 16:42:37 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:51.763 16:42:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:51.763 00:06:51.763 real 0m2.552s 00:06:51.763 user 0m2.315s 00:06:51.763 sys 0m0.245s 00:06:51.763 16:42:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:51.763 16:42:37 -- common/autotest_common.sh@10 -- # set +x 00:06:51.763 ************************************ 00:06:51.763 END TEST accel_copy_crc32c_C2 00:06:51.763 ************************************ 00:06:52.022 16:42:37 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:52.022 16:42:37 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:52.022 16:42:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:52.022 16:42:37 -- common/autotest_common.sh@10 -- # set +x 00:06:52.022 ************************************ 00:06:52.022 START TEST accel_dualcast 00:06:52.022 ************************************ 00:06:52.022 16:42:37 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:06:52.022 16:42:37 -- accel/accel.sh@16 -- # local accel_opc 00:06:52.022 16:42:37 -- accel/accel.sh@17 -- # local accel_module 00:06:52.022 16:42:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:52.022 16:42:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:52.022 16:42:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.022 16:42:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.022 16:42:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.023 16:42:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.023 16:42:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.023 16:42:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.023 16:42:37 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.023 16:42:37 -- accel/accel.sh@42 -- # jq -r . 00:06:52.023 [2024-11-16 16:42:37.566306] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:52.023 [2024-11-16 16:42:37.566390] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475321 ]
00:06:52.023 EAL: No free 2048 kB hugepages reported on node 1
00:06:52.023 [2024-11-16 16:42:37.634574] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:52.023 [2024-11-16 16:42:37.669098] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:53.402 16:42:38 -- accel/accel.sh@18 -- # out='
00:06:53.402 SPDK Configuration:
00:06:53.402 Core mask: 0x1
00:06:53.402
00:06:53.402 Accel Perf Configuration:
00:06:53.402 Workload Type: dualcast
00:06:53.402 Transfer size: 4096 bytes
00:06:53.402 Vector count 1
00:06:53.402 Module: software
00:06:53.402 Queue depth: 32
00:06:53.402 Allocate depth: 32
00:06:53.402 # threads/core: 1
00:06:53.402 Run time: 1 seconds
00:06:53.402 Verify: Yes
00:06:53.402
00:06:53.402 Running for 1 seconds...
00:06:53.402
00:06:53.402 Core,Thread Transfers Bandwidth Failed Miscompares
00:06:53.402 ------------------------------------------------------------------------------------
00:06:53.402 0,0 648736/s 2534 MiB/s 0 0
00:06:53.402 ====================================================================================
00:06:53.402 Total 648736/s 2534 MiB/s 0 0'
00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # IFS=:
00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # read -r var val
00:06:53.402 16:42:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y
00:06:53.402 16:42:38 -- accel/accel.sh@12 -- # build_accel_config
00:06:53.402 16:42:38 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:53.402 16:42:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:53.402 16:42:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y
00:06:53.402 16:42:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:53.402 16:42:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:53.402 16:42:38 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:53.402 16:42:38 -- accel/accel.sh@41 -- # local IFS=,
00:06:53.402 16:42:38 -- accel/accel.sh@42 -- # jq -r .
00:06:53.402 [2024-11-16 16:42:38.850446] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:53.402 [2024-11-16 16:42:38.850533] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475591 ] 00:06:53.402 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.402 [2024-11-16 16:42:38.916745] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.402 [2024-11-16 16:42:38.950265] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.402 16:42:38 -- accel/accel.sh@21 -- # val= 00:06:53.402 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.402 16:42:38 -- accel/accel.sh@21 -- # val= 00:06:53.402 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.402 16:42:38 -- accel/accel.sh@21 -- # val=0x1 00:06:53.402 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.402 16:42:38 -- accel/accel.sh@21 -- # val= 00:06:53.402 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.402 16:42:38 -- accel/accel.sh@21 -- # val= 00:06:53.402 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.402 16:42:38 -- accel/accel.sh@21 -- # val=dualcast 00:06:53.402 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.402 16:42:38 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.402 16:42:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:53.402 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.402 16:42:38 -- accel/accel.sh@21 -- # val= 00:06:53.402 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.402 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.403 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.403 16:42:38 -- accel/accel.sh@21 -- # val=software 00:06:53.403 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.403 16:42:38 -- accel/accel.sh@23 -- # accel_module=software 00:06:53.403 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.403 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.403 16:42:38 -- accel/accel.sh@21 -- # val=32 00:06:53.403 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.403 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.403 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.403 16:42:38 -- accel/accel.sh@21 -- # val=32 00:06:53.403 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.403 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.403 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.403 16:42:38 -- accel/accel.sh@21 -- # val=1 00:06:53.403 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.403 16:42:38 -- accel/accel.sh@20 -- # IFS=: 00:06:53.403 16:42:38 -- accel/accel.sh@20 -- # read -r var val 00:06:53.403 16:42:38 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:53.403 16:42:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.403 16:42:39 -- accel/accel.sh@20 -- # IFS=: 00:06:53.403 16:42:39 -- accel/accel.sh@20 -- # read -r var val 00:06:53.403 16:42:39 -- accel/accel.sh@21 -- # val=Yes 00:06:53.403 16:42:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.403 16:42:39 -- accel/accel.sh@20 -- # IFS=: 00:06:53.403 16:42:39 -- accel/accel.sh@20 -- # read -r var val 00:06:53.403 16:42:39 -- accel/accel.sh@21 -- # val= 00:06:53.403 16:42:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.403 16:42:39 -- accel/accel.sh@20 -- # IFS=: 00:06:53.403 16:42:39 -- accel/accel.sh@20 -- # read -r var val 00:06:53.403 16:42:39 -- accel/accel.sh@21 -- # val= 00:06:53.403 16:42:39 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.403 16:42:39 -- accel/accel.sh@20 -- # IFS=: 00:06:53.403 16:42:39 -- accel/accel.sh@20 -- # read -r var val 00:06:54.780 16:42:40 -- accel/accel.sh@21 -- # val= 00:06:54.780 16:42:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # IFS=: 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # read -r var val 00:06:54.780 16:42:40 -- accel/accel.sh@21 -- # val= 00:06:54.780 16:42:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # IFS=: 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # read -r var val 00:06:54.780 16:42:40 -- accel/accel.sh@21 -- # val= 00:06:54.780 16:42:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # IFS=: 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # read -r var val 00:06:54.780 16:42:40 -- accel/accel.sh@21 -- # val= 00:06:54.780 16:42:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # IFS=: 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # read -r var val 00:06:54.780 16:42:40 -- accel/accel.sh@21 -- # val= 00:06:54.780 16:42:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # IFS=: 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # read -r var val 00:06:54.780 16:42:40 -- accel/accel.sh@21 -- # val= 00:06:54.780 16:42:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # IFS=: 00:06:54.780 16:42:40 -- accel/accel.sh@20 -- # read -r var val 00:06:54.780 16:42:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:54.780 16:42:40 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:54.780 16:42:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:54.780 00:06:54.780 real 0m2.572s 00:06:54.780 user 0m2.332s 00:06:54.780 sys 0m0.248s 00:06:54.780 16:42:40 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:54.780 16:42:40 -- common/autotest_common.sh@10 -- # set +x 00:06:54.781 ************************************ 00:06:54.781 END TEST accel_dualcast 00:06:54.781 ************************************ 00:06:54.781 16:42:40 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:54.781 16:42:40 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:54.781 16:42:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:54.781 16:42:40 -- common/autotest_common.sh@10 -- # set +x 00:06:54.781 ************************************ 00:06:54.781 START TEST accel_compare 00:06:54.781 ************************************ 00:06:54.781 16:42:40 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:06:54.781 16:42:40 -- accel/accel.sh@16 -- # local accel_opc 00:06:54.781 16:42:40 -- 
accel/accel.sh@17 -- # local accel_module 00:06:54.781 16:42:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:54.781 16:42:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:54.781 16:42:40 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.781 16:42:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.781 16:42:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.781 16:42:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.781 16:42:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.781 16:42:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.781 16:42:40 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.781 16:42:40 -- accel/accel.sh@42 -- # jq -r . 00:06:54.781 [2024-11-16 16:42:40.181034] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:54.781 [2024-11-16 16:42:40.181127] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475821 ] 00:06:54.781 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.781 [2024-11-16 16:42:40.251041] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.781 [2024-11-16 16:42:40.286655] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.718 16:42:41 -- accel/accel.sh@18 -- # out=' 00:06:55.718 SPDK Configuration: 00:06:55.718 Core mask: 0x1 00:06:55.718 00:06:55.718 Accel Perf Configuration: 00:06:55.718 Workload Type: compare 00:06:55.718 Transfer size: 4096 bytes 00:06:55.718 Vector count 1 00:06:55.718 Module: software 00:06:55.718 Queue depth: 32 00:06:55.718 Allocate depth: 32 00:06:55.718 # threads/core: 1 00:06:55.718 Run time: 1 seconds 00:06:55.718 Verify: Yes 00:06:55.718 00:06:55.718 Running for 1 seconds... 00:06:55.718 00:06:55.718 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:55.718 ------------------------------------------------------------------------------------ 00:06:55.718 0,0 821152/s 3207 MiB/s 0 0 00:06:55.718 ==================================================================================== 00:06:55.718 Total 821152/s 3207 MiB/s 0 0' 00:06:55.718 16:42:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:55.718 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.718 16:42:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:55.718 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.718 16:42:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:55.718 16:42:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:55.718 16:42:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.718 16:42:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.718 16:42:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:55.718 16:42:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:55.718 16:42:41 -- accel/accel.sh@41 -- # local IFS=, 00:06:55.718 16:42:41 -- accel/accel.sh@42 -- # jq -r . 00:06:55.718 [2024-11-16 16:42:41.457259] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:55.718 [2024-11-16 16:42:41.457309] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid475989 ] 00:06:55.978 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.978 [2024-11-16 16:42:41.519693] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.978 [2024-11-16 16:42:41.554613] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val= 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val= 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val=0x1 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val= 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val= 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val=compare 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val= 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val=software 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@23 -- # accel_module=software 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val=32 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val=32 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val=1 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val=Yes 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val= 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:55.978 16:42:41 -- accel/accel.sh@21 -- # val= 00:06:55.978 16:42:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # IFS=: 00:06:55.978 16:42:41 -- accel/accel.sh@20 -- # read -r var val 00:06:57.356 16:42:42 -- accel/accel.sh@21 -- # val= 00:06:57.356 16:42:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # IFS=: 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # read -r var val 00:06:57.356 16:42:42 -- accel/accel.sh@21 -- # val= 00:06:57.356 16:42:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # IFS=: 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # read -r var val 00:06:57.356 16:42:42 -- accel/accel.sh@21 -- # val= 00:06:57.356 16:42:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # IFS=: 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # read -r var val 00:06:57.356 16:42:42 -- accel/accel.sh@21 -- # val= 00:06:57.356 16:42:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # IFS=: 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # read -r var val 00:06:57.356 16:42:42 -- accel/accel.sh@21 -- # val= 00:06:57.356 16:42:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # IFS=: 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # read -r var val 00:06:57.356 16:42:42 -- accel/accel.sh@21 -- # val= 00:06:57.356 16:42:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # IFS=: 00:06:57.356 16:42:42 -- accel/accel.sh@20 -- # read -r var val 00:06:57.356 16:42:42 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:57.356 16:42:42 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:57.356 16:42:42 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:57.356 00:06:57.356 real 0m2.561s 00:06:57.356 user 0m2.321s 00:06:57.356 sys 0m0.247s 00:06:57.356 16:42:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:57.356 16:42:42 -- common/autotest_common.sh@10 -- # set +x 00:06:57.356 ************************************ 00:06:57.356 END TEST accel_compare 00:06:57.356 ************************************ 00:06:57.356 16:42:42 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:57.356 16:42:42 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:57.356 16:42:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:57.356 16:42:42 -- common/autotest_common.sh@10 -- # set +x 00:06:57.356 ************************************ 00:06:57.356 START TEST accel_xor 00:06:57.356 ************************************ 00:06:57.356 16:42:42 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:06:57.356 16:42:42 -- accel/accel.sh@16 -- # local accel_opc 00:06:57.356 16:42:42 -- accel/accel.sh@17 
-- # local accel_module 00:06:57.356 16:42:42 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:57.356 16:42:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:57.356 16:42:42 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.356 16:42:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:57.356 16:42:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.356 16:42:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.356 16:42:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:57.356 16:42:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:57.356 16:42:42 -- accel/accel.sh@41 -- # local IFS=, 00:06:57.356 16:42:42 -- accel/accel.sh@42 -- # jq -r . 00:06:57.356 [2024-11-16 16:42:42.776545] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:06:57.356 [2024-11-16 16:42:42.776636] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476183 ] 00:06:57.356 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.356 [2024-11-16 16:42:42.845438] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.356 [2024-11-16 16:42:42.880497] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.293 16:42:44 -- accel/accel.sh@18 -- # out=' 00:06:58.293 SPDK Configuration: 00:06:58.293 Core mask: 0x1 00:06:58.293 00:06:58.293 Accel Perf Configuration: 00:06:58.293 Workload Type: xor 00:06:58.293 Source buffers: 2 00:06:58.293 Transfer size: 4096 bytes 00:06:58.293 Vector count 1 00:06:58.293 Module: software 00:06:58.293 Queue depth: 32 00:06:58.293 Allocate depth: 32 00:06:58.293 # threads/core: 1 00:06:58.293 Run time: 1 seconds 00:06:58.293 Verify: Yes 00:06:58.293 00:06:58.293 Running for 1 seconds... 00:06:58.293 00:06:58.293 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:58.293 ------------------------------------------------------------------------------------ 00:06:58.293 0,0 714400/s 2790 MiB/s 0 0 00:06:58.293 ==================================================================================== 00:06:58.293 Total 714400/s 2790 MiB/s 0 0' 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.553 16:42:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:58.553 16:42:44 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.553 16:42:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.553 16:42:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.553 16:42:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:58.553 16:42:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.553 16:42:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.553 16:42:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.553 16:42:44 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.553 16:42:44 -- accel/accel.sh@42 -- # jq -r . 00:06:58.553 [2024-11-16 16:42:44.060871] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:06:58.553 [2024-11-16 16:42:44.060974] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476449 ] 00:06:58.553 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.553 [2024-11-16 16:42:44.127604] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.553 [2024-11-16 16:42:44.161618] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.553 16:42:44 -- accel/accel.sh@21 -- # val= 00:06:58.553 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.553 16:42:44 -- accel/accel.sh@21 -- # val= 00:06:58.553 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.553 16:42:44 -- accel/accel.sh@21 -- # val=0x1 00:06:58.553 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.553 16:42:44 -- accel/accel.sh@21 -- # val= 00:06:58.553 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.553 16:42:44 -- accel/accel.sh@21 -- # val= 00:06:58.553 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.553 16:42:44 -- accel/accel.sh@21 -- # val=xor 00:06:58.553 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.553 16:42:44 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.553 16:42:44 -- accel/accel.sh@21 -- # val=2 00:06:58.553 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.553 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.553 16:42:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:58.553 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.554 16:42:44 -- accel/accel.sh@21 -- # val= 00:06:58.554 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.554 16:42:44 -- accel/accel.sh@21 -- # val=software 00:06:58.554 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@23 -- # accel_module=software 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.554 16:42:44 -- accel/accel.sh@21 -- # val=32 00:06:58.554 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.554 16:42:44 -- accel/accel.sh@21 -- # val=32 00:06:58.554 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.554 16:42:44 -- 
accel/accel.sh@21 -- # val=1 00:06:58.554 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.554 16:42:44 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:58.554 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.554 16:42:44 -- accel/accel.sh@21 -- # val=Yes 00:06:58.554 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.554 16:42:44 -- accel/accel.sh@21 -- # val= 00:06:58.554 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:58.554 16:42:44 -- accel/accel.sh@21 -- # val= 00:06:58.554 16:42:44 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # IFS=: 00:06:58.554 16:42:44 -- accel/accel.sh@20 -- # read -r var val 00:06:59.932 16:42:45 -- accel/accel.sh@21 -- # val= 00:06:59.932 16:42:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # IFS=: 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # read -r var val 00:06:59.932 16:42:45 -- accel/accel.sh@21 -- # val= 00:06:59.932 16:42:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # IFS=: 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # read -r var val 00:06:59.932 16:42:45 -- accel/accel.sh@21 -- # val= 00:06:59.932 16:42:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # IFS=: 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # read -r var val 00:06:59.932 16:42:45 -- accel/accel.sh@21 -- # val= 00:06:59.932 16:42:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # IFS=: 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # read -r var val 00:06:59.932 16:42:45 -- accel/accel.sh@21 -- # val= 00:06:59.932 16:42:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # IFS=: 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # read -r var val 00:06:59.932 16:42:45 -- accel/accel.sh@21 -- # val= 00:06:59.932 16:42:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # IFS=: 00:06:59.932 16:42:45 -- accel/accel.sh@20 -- # read -r var val 00:06:59.932 16:42:45 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:59.932 16:42:45 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:59.932 16:42:45 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.932 00:06:59.932 real 0m2.571s 00:06:59.932 user 0m2.326s 00:06:59.932 sys 0m0.253s 00:06:59.932 16:42:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:59.932 16:42:45 -- common/autotest_common.sh@10 -- # set +x 00:06:59.932 ************************************ 00:06:59.932 END TEST accel_xor 00:06:59.932 ************************************ 00:06:59.932 16:42:45 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:59.932 16:42:45 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:59.932 16:42:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:59.932 16:42:45 -- common/autotest_common.sh@10 -- # set +x 00:06:59.932 ************************************ 00:06:59.932 START TEST accel_xor 
00:06:59.932 ************************************
00:06:59.932 16:42:45 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3
00:06:59.932 16:42:45 -- accel/accel.sh@16 -- # local accel_opc
00:06:59.932 16:42:45 -- accel/accel.sh@17 -- # local accel_module
00:06:59.932 16:42:45 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3
00:06:59.932 16:42:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3
00:06:59.932 16:42:45 -- accel/accel.sh@12 -- # build_accel_config
00:06:59.932 16:42:45 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:59.932 16:42:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:59.932 16:42:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:59.932 16:42:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:59.932 16:42:45 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:59.932 16:42:45 -- accel/accel.sh@41 -- # local IFS=,
00:06:59.932 16:42:45 -- accel/accel.sh@42 -- # jq -r .
00:06:59.932 [2024-11-16 16:42:45.378155] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:06:59.932 [2024-11-16 16:42:45.378211] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid476732 ]
00:06:59.932 EAL: No free 2048 kB hugepages reported on node 1
00:06:59.932 [2024-11-16 16:42:45.442539] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:59.932 [2024-11-16 16:42:45.477183] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:01.311 16:42:46 -- accel/accel.sh@18 -- # out='
00:07:01.311 SPDK Configuration:
00:07:01.311 Core mask: 0x1
00:07:01.311
00:07:01.311 Accel Perf Configuration:
00:07:01.311 Workload Type: xor
00:07:01.311 Source buffers: 3
00:07:01.311 Transfer size: 4096 bytes
00:07:01.311 Vector count 1
00:07:01.311 Module: software
00:07:01.311 Queue depth: 32
00:07:01.311 Allocate depth: 32
00:07:01.311 # threads/core: 1
00:07:01.311 Run time: 1 seconds
00:07:01.311 Verify: Yes
00:07:01.311
00:07:01.311 Running for 1 seconds...
00:07:01.311
00:07:01.311 Core,Thread Transfers Bandwidth Failed Miscompares
00:07:01.311 ------------------------------------------------------------------------------------
00:07:01.311 0,0 661312/s 2583 MiB/s 0 0
00:07:01.311 ====================================================================================
00:07:01.311 Total 661312/s 2583 MiB/s 0 0'
00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=:
00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val
00:07:01.311 16:42:46 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3
00:07:01.311 16:42:46 -- accel/accel.sh@12 -- # build_accel_config
00:07:01.311 16:42:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3
00:07:01.311 16:42:46 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:07:01.311 16:42:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:07:01.311 16:42:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:07:01.311 16:42:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:07:01.311 16:42:46 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:07:01.311 16:42:46 -- accel/accel.sh@41 -- # local IFS=,
00:07:01.311 16:42:46 -- accel/accel.sh@42 -- # jq -r .
00:07:01.311 [2024-11-16 16:42:46.648902] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:07:01.311 [2024-11-16 16:42:46.648952] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477004 ] 00:07:01.311 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.311 [2024-11-16 16:42:46.709147] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.311 [2024-11-16 16:42:46.743315] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val= 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val= 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val=0x1 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val= 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val= 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val=xor 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@24 -- # accel_opc=xor 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val=3 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val= 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val=software 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@23 -- # accel_module=software 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val=32 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val=32 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- 
accel/accel.sh@21 -- # val=1 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val=Yes 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val= 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:01.311 16:42:46 -- accel/accel.sh@21 -- # val= 00:07:01.311 16:42:46 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # IFS=: 00:07:01.311 16:42:46 -- accel/accel.sh@20 -- # read -r var val 00:07:02.255 16:42:47 -- accel/accel.sh@21 -- # val= 00:07:02.255 16:42:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # IFS=: 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # read -r var val 00:07:02.255 16:42:47 -- accel/accel.sh@21 -- # val= 00:07:02.255 16:42:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # IFS=: 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # read -r var val 00:07:02.255 16:42:47 -- accel/accel.sh@21 -- # val= 00:07:02.255 16:42:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # IFS=: 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # read -r var val 00:07:02.255 16:42:47 -- accel/accel.sh@21 -- # val= 00:07:02.255 16:42:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # IFS=: 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # read -r var val 00:07:02.255 16:42:47 -- accel/accel.sh@21 -- # val= 00:07:02.255 16:42:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # IFS=: 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # read -r var val 00:07:02.255 16:42:47 -- accel/accel.sh@21 -- # val= 00:07:02.255 16:42:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # IFS=: 00:07:02.255 16:42:47 -- accel/accel.sh@20 -- # read -r var val 00:07:02.255 16:42:47 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:02.255 16:42:47 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:07:02.255 16:42:47 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.255 00:07:02.255 real 0m2.543s 00:07:02.255 user 0m2.313s 00:07:02.255 sys 0m0.239s 00:07:02.255 16:42:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:02.255 16:42:47 -- common/autotest_common.sh@10 -- # set +x 00:07:02.255 ************************************ 00:07:02.255 END TEST accel_xor 00:07:02.255 ************************************ 00:07:02.255 16:42:47 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:02.255 16:42:47 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:02.255 16:42:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:02.255 16:42:47 -- common/autotest_common.sh@10 -- # set +x 00:07:02.255 ************************************ 00:07:02.255 START TEST 
accel_dif_verify 00:07:02.255 ************************************ 00:07:02.255 16:42:47 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:07:02.255 16:42:47 -- accel/accel.sh@16 -- # local accel_opc 00:07:02.255 16:42:47 -- accel/accel.sh@17 -- # local accel_module 00:07:02.255 16:42:47 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:07:02.255 16:42:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:02.255 16:42:47 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.255 16:42:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.255 16:42:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.255 16:42:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.255 16:42:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.255 16:42:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.255 16:42:47 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.255 16:42:47 -- accel/accel.sh@42 -- # jq -r . 00:07:02.255 [2024-11-16 16:42:47.969120] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:02.255 [2024-11-16 16:42:47.969207] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477283 ] 00:07:02.514 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.514 [2024-11-16 16:42:48.037718] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.514 [2024-11-16 16:42:48.073058] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.891 16:42:49 -- accel/accel.sh@18 -- # out=' 00:07:03.891 SPDK Configuration: 00:07:03.891 Core mask: 0x1 00:07:03.891 00:07:03.891 Accel Perf Configuration: 00:07:03.892 Workload Type: dif_verify 00:07:03.892 Vector size: 4096 bytes 00:07:03.892 Transfer size: 4096 bytes 00:07:03.892 Block size: 512 bytes 00:07:03.892 Metadata size: 8 bytes 00:07:03.892 Vector count 1 00:07:03.892 Module: software 00:07:03.892 Queue depth: 32 00:07:03.892 Allocate depth: 32 00:07:03.892 # threads/core: 1 00:07:03.892 Run time: 1 seconds 00:07:03.892 Verify: No 00:07:03.892 00:07:03.892 Running for 1 seconds... 00:07:03.892 00:07:03.892 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:03.892 ------------------------------------------------------------------------------------ 00:07:03.892 0,0 248640/s 986 MiB/s 0 0 00:07:03.892 ==================================================================================== 00:07:03.892 Total 248640/s 971 MiB/s 0 0' 00:07:03.892 16:42:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@12 -- # build_accel_config 00:07:03.892 16:42:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:03.892 16:42:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:03.892 16:42:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:03.892 16:42:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:03.892 16:42:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:03.892 16:42:49 -- accel/accel.sh@41 -- # local IFS=, 00:07:03.892 16:42:49 -- accel/accel.sh@42 -- # jq -r . 
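As a cross-check of the dif_verify table above, the Total row's bandwidth follows from the transfer rate and the 4096-byte transfer size (taking 1 MiB = 1048576 bytes); a quick shell sanity check:

  # 248640 transfers/s * 4096 bytes per transfer, expressed in MiB/s
  echo $(( 248640 * 4096 / 1048576 ))   # prints 971, matching "Total 248640/s 971 MiB/s"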
00:07:03.892 [2024-11-16 16:42:49.243316] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:03.892 [2024-11-16 16:42:49.243365] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477443 ] 00:07:03.892 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.892 [2024-11-16 16:42:49.300624] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.892 [2024-11-16 16:42:49.335793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val= 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val= 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val=0x1 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val= 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val= 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val=dif_verify 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val= 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val=software 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@23 -- # 
accel_module=software 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val=32 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val=32 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val=1 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val=No 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val= 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:03.892 16:42:49 -- accel/accel.sh@21 -- # val= 00:07:03.892 16:42:49 -- accel/accel.sh@22 -- # case "$var" in 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # IFS=: 00:07:03.892 16:42:49 -- accel/accel.sh@20 -- # read -r var val 00:07:04.830 16:42:50 -- accel/accel.sh@21 -- # val= 00:07:04.830 16:42:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.830 16:42:50 -- accel/accel.sh@21 -- # val= 00:07:04.830 16:42:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.830 16:42:50 -- accel/accel.sh@21 -- # val= 00:07:04.830 16:42:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.830 16:42:50 -- accel/accel.sh@21 -- # val= 00:07:04.830 16:42:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.830 16:42:50 -- accel/accel.sh@21 -- # val= 00:07:04.830 16:42:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.830 16:42:50 -- accel/accel.sh@21 -- # val= 00:07:04.830 16:42:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # IFS=: 00:07:04.830 16:42:50 -- accel/accel.sh@20 -- # read -r var val 00:07:04.830 16:42:50 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:04.830 16:42:50 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:07:04.830 16:42:50 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.830 00:07:04.830 real 0m2.557s 00:07:04.830 user 0m2.328s 00:07:04.830 sys 0m0.240s 00:07:04.830 16:42:50 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:07:04.830 16:42:50 -- common/autotest_common.sh@10 -- # set +x 00:07:04.830 ************************************ 00:07:04.830 END TEST accel_dif_verify 00:07:04.830 ************************************ 00:07:04.830 16:42:50 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:04.830 16:42:50 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:04.830 16:42:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:04.830 16:42:50 -- common/autotest_common.sh@10 -- # set +x 00:07:04.830 ************************************ 00:07:04.830 START TEST accel_dif_generate 00:07:04.830 ************************************ 00:07:04.830 16:42:50 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:07:04.830 16:42:50 -- accel/accel.sh@16 -- # local accel_opc 00:07:04.830 16:42:50 -- accel/accel.sh@17 -- # local accel_module 00:07:04.830 16:42:50 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:07:04.830 16:42:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:04.830 16:42:50 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.830 16:42:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.830 16:42:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.830 16:42:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.830 16:42:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.830 16:42:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.830 16:42:50 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.830 16:42:50 -- accel/accel.sh@42 -- # jq -r . 00:07:04.830 [2024-11-16 16:42:50.569235] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:04.830 [2024-11-16 16:42:50.569322] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477616 ] 00:07:05.089 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.089 [2024-11-16 16:42:50.636269] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.089 [2024-11-16 16:42:50.671756] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.466 16:42:51 -- accel/accel.sh@18 -- # out=' 00:07:06.466 SPDK Configuration: 00:07:06.466 Core mask: 0x1 00:07:06.466 00:07:06.466 Accel Perf Configuration: 00:07:06.466 Workload Type: dif_generate 00:07:06.466 Vector size: 4096 bytes 00:07:06.466 Transfer size: 4096 bytes 00:07:06.466 Block size: 512 bytes 00:07:06.466 Metadata size: 8 bytes 00:07:06.466 Vector count 1 00:07:06.466 Module: software 00:07:06.466 Queue depth: 32 00:07:06.466 Allocate depth: 32 00:07:06.466 # threads/core: 1 00:07:06.466 Run time: 1 seconds 00:07:06.466 Verify: No 00:07:06.466 00:07:06.466 Running for 1 seconds... 
00:07:06.466 00:07:06.466 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:06.466 ------------------------------------------------------------------------------------ 00:07:06.466 0,0 282144/s 1119 MiB/s 0 0 00:07:06.466 ==================================================================================== 00:07:06.466 Total 282144/s 1102 MiB/s 0 0' 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:51 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:06.466 16:42:51 -- accel/accel.sh@12 -- # build_accel_config 00:07:06.466 16:42:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:06.466 16:42:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:06.466 16:42:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:06.466 16:42:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:06.466 16:42:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:06.466 16:42:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:06.466 16:42:51 -- accel/accel.sh@41 -- # local IFS=, 00:07:06.466 16:42:51 -- accel/accel.sh@42 -- # jq -r . 00:07:06.466 [2024-11-16 16:42:51.852267] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:06.466 [2024-11-16 16:42:51.852354] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid477868 ] 00:07:06.466 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.466 [2024-11-16 16:42:51.920166] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.466 [2024-11-16 16:42:51.954058] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.466 16:42:51 -- accel/accel.sh@21 -- # val= 00:07:06.466 16:42:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:51 -- accel/accel.sh@21 -- # val= 00:07:06.466 16:42:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:51 -- accel/accel.sh@21 -- # val=0x1 00:07:06.466 16:42:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:51 -- accel/accel.sh@21 -- # val= 00:07:06.466 16:42:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:51 -- accel/accel.sh@21 -- # val= 00:07:06.466 16:42:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:51 -- accel/accel.sh@21 -- # val=dif_generate 00:07:06.466 16:42:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:51 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:51 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:06.466 16:42:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # IFS=: 
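The START/END banners and the real/user/sys summaries around each test come from the run_test wrapper visible in the command lines above (run_test accel_dif_generate accel_test -t 1 -w dif_generate): it prints a banner, times the named test function, and prints a closing banner. A simplified sketch of that pattern, not the exact autotest_common.sh implementation:

  run_test() {
    local name=$1; shift
    echo "START TEST $name"
    time "$@"    # bash time keyword; source of the real/user/sys lines
    echo "END TEST $name"
  }
  run_test accel_dif_generate accel_test -t 1 -w dif_generate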
00:07:06.466 16:42:51 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:51 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:06.466 16:42:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:52 -- accel/accel.sh@21 -- # val='512 bytes' 00:07:06.466 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:52 -- accel/accel.sh@21 -- # val='8 bytes' 00:07:06.466 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.466 16:42:52 -- accel/accel.sh@21 -- # val= 00:07:06.466 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.466 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.466 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.467 16:42:52 -- accel/accel.sh@21 -- # val=software 00:07:06.467 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.467 16:42:52 -- accel/accel.sh@23 -- # accel_module=software 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.467 16:42:52 -- accel/accel.sh@21 -- # val=32 00:07:06.467 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.467 16:42:52 -- accel/accel.sh@21 -- # val=32 00:07:06.467 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.467 16:42:52 -- accel/accel.sh@21 -- # val=1 00:07:06.467 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.467 16:42:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:06.467 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.467 16:42:52 -- accel/accel.sh@21 -- # val=No 00:07:06.467 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.467 16:42:52 -- accel/accel.sh@21 -- # val= 00:07:06.467 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:06.467 16:42:52 -- accel/accel.sh@21 -- # val= 00:07:06.467 16:42:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # IFS=: 00:07:06.467 16:42:52 -- accel/accel.sh@20 -- # read -r var val 00:07:07.402 16:42:53 -- accel/accel.sh@21 -- # val= 00:07:07.402 16:42:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # IFS=: 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # read -r var val 00:07:07.402 16:42:53 -- accel/accel.sh@21 -- # val= 00:07:07.402 16:42:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # IFS=: 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # read -r var val 00:07:07.402 16:42:53 -- accel/accel.sh@21 -- # val= 00:07:07.402 16:42:53 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # IFS=: 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # read -r var val 00:07:07.402 16:42:53 -- accel/accel.sh@21 -- # val= 00:07:07.402 16:42:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # IFS=: 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # read -r var val 00:07:07.402 16:42:53 -- accel/accel.sh@21 -- # val= 00:07:07.402 16:42:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # IFS=: 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # read -r var val 00:07:07.402 16:42:53 -- accel/accel.sh@21 -- # val= 00:07:07.402 16:42:53 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # IFS=: 00:07:07.402 16:42:53 -- accel/accel.sh@20 -- # read -r var val 00:07:07.402 16:42:53 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:07.402 16:42:53 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:07:07.402 16:42:53 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.402 00:07:07.402 real 0m2.572s 00:07:07.402 user 0m2.318s 00:07:07.402 sys 0m0.264s 00:07:07.402 16:42:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:07.402 16:42:53 -- common/autotest_common.sh@10 -- # set +x 00:07:07.402 ************************************ 00:07:07.402 END TEST accel_dif_generate 00:07:07.402 ************************************ 00:07:07.662 16:42:53 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:07.662 16:42:53 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:07:07.662 16:42:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:07.662 16:42:53 -- common/autotest_common.sh@10 -- # set +x 00:07:07.662 ************************************ 00:07:07.662 START TEST accel_dif_generate_copy 00:07:07.662 ************************************ 00:07:07.662 16:42:53 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:07:07.662 16:42:53 -- accel/accel.sh@16 -- # local accel_opc 00:07:07.662 16:42:53 -- accel/accel.sh@17 -- # local accel_module 00:07:07.662 16:42:53 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:07:07.662 16:42:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:07.662 16:42:53 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.662 16:42:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.662 16:42:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.662 16:42:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.662 16:42:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.662 16:42:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.662 16:42:53 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.662 16:42:53 -- accel/accel.sh@42 -- # jq -r . 00:07:07.662 [2024-11-16 16:42:53.184850] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:07.662 [2024-11-16 16:42:53.184944] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478149 ] 00:07:07.662 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.662 [2024-11-16 16:42:53.256905] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.662 [2024-11-16 16:42:53.290207] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.040 16:42:54 -- accel/accel.sh@18 -- # out=' 00:07:09.040 SPDK Configuration: 00:07:09.040 Core mask: 0x1 00:07:09.041 00:07:09.041 Accel Perf Configuration: 00:07:09.041 Workload Type: dif_generate_copy 00:07:09.041 Vector size: 4096 bytes 00:07:09.041 Transfer size: 4096 bytes 00:07:09.041 Vector count 1 00:07:09.041 Module: software 00:07:09.041 Queue depth: 32 00:07:09.041 Allocate depth: 32 00:07:09.041 # threads/core: 1 00:07:09.041 Run time: 1 seconds 00:07:09.041 Verify: No 00:07:09.041 00:07:09.041 Running for 1 seconds... 00:07:09.041 00:07:09.041 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:09.041 ------------------------------------------------------------------------------------ 00:07:09.041 0,0 222784/s 883 MiB/s 0 0 00:07:09.041 ==================================================================================== 00:07:09.041 Total 222784/s 870 MiB/s 0 0' 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:09.041 16:42:54 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.041 16:42:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.041 16:42:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.041 16:42:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:09.041 16:42:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.041 16:42:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.041 16:42:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.041 16:42:54 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.041 16:42:54 -- accel/accel.sh@42 -- # jq -r . 00:07:09.041 [2024-11-16 16:42:54.471237] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
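dif_generate_copy generates the DIF metadata and copies the source data to a destination buffer in one operation; the extra copy is the likely reason its throughput (222784/s * 4096 B / 2^20 ≈ 870 MiB/s in the Total row above) trails plain dif_generate (1102 MiB/s). The two can be compared back to back with the same binary, a sketch assuming the built tree:

  for w in dif_generate dif_generate_copy; do
    ./build/examples/accel_perf -t 1 -w "$w" -o 4096
  done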
00:07:09.041 [2024-11-16 16:42:54.471329] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478417 ] 00:07:09.041 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.041 [2024-11-16 16:42:54.538665] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.041 [2024-11-16 16:42:54.573107] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val= 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val= 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val=0x1 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val= 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val= 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val= 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val=software 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@23 -- # accel_module=software 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val=32 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val=32 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var 
val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val=1 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val=No 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val= 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:09.041 16:42:54 -- accel/accel.sh@21 -- # val= 00:07:09.041 16:42:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # IFS=: 00:07:09.041 16:42:54 -- accel/accel.sh@20 -- # read -r var val 00:07:10.419 16:42:55 -- accel/accel.sh@21 -- # val= 00:07:10.419 16:42:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # IFS=: 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # read -r var val 00:07:10.419 16:42:55 -- accel/accel.sh@21 -- # val= 00:07:10.419 16:42:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # IFS=: 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # read -r var val 00:07:10.419 16:42:55 -- accel/accel.sh@21 -- # val= 00:07:10.419 16:42:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # IFS=: 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # read -r var val 00:07:10.419 16:42:55 -- accel/accel.sh@21 -- # val= 00:07:10.419 16:42:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # IFS=: 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # read -r var val 00:07:10.419 16:42:55 -- accel/accel.sh@21 -- # val= 00:07:10.419 16:42:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # IFS=: 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # read -r var val 00:07:10.419 16:42:55 -- accel/accel.sh@21 -- # val= 00:07:10.419 16:42:55 -- accel/accel.sh@22 -- # case "$var" in 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # IFS=: 00:07:10.419 16:42:55 -- accel/accel.sh@20 -- # read -r var val 00:07:10.419 16:42:55 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:10.419 16:42:55 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:10.419 16:42:55 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:10.419 00:07:10.419 real 0m2.578s 00:07:10.419 user 0m2.326s 00:07:10.419 sys 0m0.260s 00:07:10.419 16:42:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:10.419 16:42:55 -- common/autotest_common.sh@10 -- # set +x 00:07:10.419 ************************************ 00:07:10.419 END TEST accel_dif_generate_copy 00:07:10.419 ************************************ 00:07:10.419 16:42:55 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:10.419 16:42:55 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.419 16:42:55 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:10.419 16:42:55 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:07:10.419 16:42:55 -- common/autotest_common.sh@10 -- # set +x 00:07:10.419 ************************************ 00:07:10.419 START TEST accel_comp 00:07:10.419 ************************************ 00:07:10.419 16:42:55 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.419 16:42:55 -- accel/accel.sh@16 -- # local accel_opc 00:07:10.419 16:42:55 -- accel/accel.sh@17 -- # local accel_module 00:07:10.419 16:42:55 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.419 16:42:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:10.419 16:42:55 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.419 16:42:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:10.419 16:42:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.419 16:42:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.419 16:42:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:10.419 16:42:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:10.419 16:42:55 -- accel/accel.sh@41 -- # local IFS=, 00:07:10.419 16:42:55 -- accel/accel.sh@42 -- # jq -r . 00:07:10.419 [2024-11-16 16:42:55.802704] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:10.419 [2024-11-16 16:42:55.802785] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478698 ] 00:07:10.419 EAL: No free 2048 kB hugepages reported on node 1 00:07:10.419 [2024-11-16 16:42:55.868314] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.419 [2024-11-16 16:42:55.903463] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.357 16:42:57 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:11.357 00:07:11.357 SPDK Configuration: 00:07:11.357 Core mask: 0x1 00:07:11.357 00:07:11.357 Accel Perf Configuration: 00:07:11.357 Workload Type: compress 00:07:11.357 Transfer size: 4096 bytes 00:07:11.357 Vector count 1 00:07:11.357 Module: software 00:07:11.357 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:11.357 Queue depth: 32 00:07:11.357 Allocate depth: 32 00:07:11.357 # threads/core: 1 00:07:11.357 Run time: 1 seconds 00:07:11.357 Verify: No 00:07:11.357 00:07:11.357 Running for 1 seconds... 
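Unlike the DIF workloads, compress takes a real input file: the -l path echoed above and repeated as File Name in the configuration block. A standalone sketch, run from the root of a built SPDK tree:

  ./build/examples/accel_perf -t 1 -w compress -l ./test/accel/bib

At 4096-byte transfers the Total row below works out the same way as before: 66912/s * 4096 B / 2^20 ≈ 261 MiB/s of input consumed.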
00:07:11.357 00:07:11.357 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:11.357 ------------------------------------------------------------------------------------ 00:07:11.357 0,0 66912/s 278 MiB/s 0 0 00:07:11.357 ==================================================================================== 00:07:11.357 Total 66912/s 261 MiB/s 0 0' 00:07:11.357 16:42:57 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:11.357 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.357 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.357 16:42:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:11.357 16:42:57 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.357 16:42:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:11.357 16:42:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.357 16:42:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.357 16:42:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:11.357 16:42:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:11.357 16:42:57 -- accel/accel.sh@41 -- # local IFS=, 00:07:11.357 16:42:57 -- accel/accel.sh@42 -- # jq -r . 00:07:11.357 [2024-11-16 16:42:57.078273] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:11.357 [2024-11-16 16:42:57.078324] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid478917 ] 00:07:11.616 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.616 [2024-11-16 16:42:57.137220] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.616 [2024-11-16 16:42:57.171966] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val= 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val= 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val= 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val=0x1 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val= 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val= 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val=compress 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 
16:42:57 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val= 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val=software 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@23 -- # accel_module=software 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val=32 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val=32 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val=1 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val=No 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val= 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:11.616 16:42:57 -- accel/accel.sh@21 -- # val= 00:07:11.616 16:42:57 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # IFS=: 00:07:11.616 16:42:57 -- accel/accel.sh@20 -- # read -r var val 00:07:12.994 16:42:58 -- accel/accel.sh@21 -- # val= 00:07:12.994 16:42:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # IFS=: 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # read -r var val 00:07:12.994 16:42:58 -- accel/accel.sh@21 -- # val= 00:07:12.994 16:42:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # IFS=: 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # read -r var val 00:07:12.994 16:42:58 -- accel/accel.sh@21 -- # val= 00:07:12.994 16:42:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # 
IFS=: 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # read -r var val 00:07:12.994 16:42:58 -- accel/accel.sh@21 -- # val= 00:07:12.994 16:42:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # IFS=: 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # read -r var val 00:07:12.994 16:42:58 -- accel/accel.sh@21 -- # val= 00:07:12.994 16:42:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # IFS=: 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # read -r var val 00:07:12.994 16:42:58 -- accel/accel.sh@21 -- # val= 00:07:12.994 16:42:58 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # IFS=: 00:07:12.994 16:42:58 -- accel/accel.sh@20 -- # read -r var val 00:07:12.994 16:42:58 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:12.994 16:42:58 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:12.994 16:42:58 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:12.994 00:07:12.994 real 0m2.560s 00:07:12.994 user 0m2.321s 00:07:12.994 sys 0m0.248s 00:07:12.994 16:42:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:12.994 16:42:58 -- common/autotest_common.sh@10 -- # set +x 00:07:12.994 ************************************ 00:07:12.994 END TEST accel_comp 00:07:12.994 ************************************ 00:07:12.994 16:42:58 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:12.994 16:42:58 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:12.994 16:42:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:12.994 16:42:58 -- common/autotest_common.sh@10 -- # set +x 00:07:12.994 ************************************ 00:07:12.994 START TEST accel_decomp 00:07:12.994 ************************************ 00:07:12.994 16:42:58 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:12.994 16:42:58 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.994 16:42:58 -- accel/accel.sh@17 -- # local accel_module 00:07:12.994 16:42:58 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:12.994 16:42:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:12.994 16:42:58 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.994 16:42:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.994 16:42:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.994 16:42:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.994 16:42:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.994 16:42:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.994 16:42:58 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.994 16:42:58 -- accel/accel.sh@42 -- # jq -r . 00:07:12.994 [2024-11-16 16:42:58.406719] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
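The decompress tests that follow add the -y flag, so their configuration blocks report Verify: Yes; each decompressed buffer is presumably checked against the original input rather than only being timed. Equivalent standalone sketch:

  ./build/examples/accel_perf -t 1 -w decompress -l ./test/accel/bib -y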
00:07:12.994 [2024-11-16 16:42:58.406829] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479093 ] 00:07:12.994 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.994 [2024-11-16 16:42:58.474558] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.994 [2024-11-16 16:42:58.509907] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.943 16:42:59 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:13.943 00:07:13.943 SPDK Configuration: 00:07:13.943 Core mask: 0x1 00:07:13.943 00:07:13.943 Accel Perf Configuration: 00:07:13.943 Workload Type: decompress 00:07:13.943 Transfer size: 4096 bytes 00:07:13.943 Vector count 1 00:07:13.943 Module: software 00:07:13.943 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:13.943 Queue depth: 32 00:07:13.943 Allocate depth: 32 00:07:13.943 # threads/core: 1 00:07:13.943 Run time: 1 seconds 00:07:13.943 Verify: Yes 00:07:13.943 00:07:13.943 Running for 1 seconds... 00:07:13.943 00:07:13.943 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:13.943 ------------------------------------------------------------------------------------ 00:07:13.943 0,0 93696/s 172 MiB/s 0 0 00:07:13.943 ==================================================================================== 00:07:13.943 Total 93696/s 366 MiB/s 0 0' 00:07:13.943 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:13.943 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:13.943 16:42:59 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:13.943 16:42:59 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.943 16:42:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.943 16:42:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.943 16:42:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:13.943 16:42:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.943 16:42:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.943 16:42:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.943 16:42:59 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.943 16:42:59 -- accel/accel.sh@42 -- # jq -r . 00:07:13.943 [2024-11-16 16:42:59.686927] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:13.943 [2024-11-16 16:42:59.686988] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479279 ] 00:07:14.202 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.202 [2024-11-16 16:42:59.747511] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.202 [2024-11-16 16:42:59.781819] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.202 16:42:59 -- accel/accel.sh@21 -- # val= 00:07:14.202 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.202 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.202 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.202 16:42:59 -- accel/accel.sh@21 -- # val= 00:07:14.202 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.202 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.202 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val= 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val=0x1 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val= 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val= 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val=decompress 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val= 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val=software 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@23 -- # accel_module=software 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val=32 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 
-- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val=32 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val=1 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val=Yes 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val= 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:14.203 16:42:59 -- accel/accel.sh@21 -- # val= 00:07:14.203 16:42:59 -- accel/accel.sh@22 -- # case "$var" in 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # IFS=: 00:07:14.203 16:42:59 -- accel/accel.sh@20 -- # read -r var val 00:07:15.582 16:43:00 -- accel/accel.sh@21 -- # val= 00:07:15.582 16:43:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # IFS=: 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # read -r var val 00:07:15.582 16:43:00 -- accel/accel.sh@21 -- # val= 00:07:15.582 16:43:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # IFS=: 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # read -r var val 00:07:15.582 16:43:00 -- accel/accel.sh@21 -- # val= 00:07:15.582 16:43:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # IFS=: 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # read -r var val 00:07:15.582 16:43:00 -- accel/accel.sh@21 -- # val= 00:07:15.582 16:43:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # IFS=: 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # read -r var val 00:07:15.582 16:43:00 -- accel/accel.sh@21 -- # val= 00:07:15.582 16:43:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # IFS=: 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # read -r var val 00:07:15.582 16:43:00 -- accel/accel.sh@21 -- # val= 00:07:15.582 16:43:00 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # IFS=: 00:07:15.582 16:43:00 -- accel/accel.sh@20 -- # read -r var val 00:07:15.582 16:43:00 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:15.582 16:43:00 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:15.582 16:43:00 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:15.582 00:07:15.582 real 0m2.569s 00:07:15.582 user 0m2.324s 00:07:15.582 sys 0m0.255s 00:07:15.582 16:43:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:15.582 16:43:00 -- common/autotest_common.sh@10 -- # set +x 00:07:15.582 ************************************ 00:07:15.582 END TEST accel_decomp 00:07:15.582 ************************************ 00:07:15.582 16:43:00 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 -w 
decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:15.582 16:43:00 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:15.582 16:43:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:15.582 16:43:00 -- common/autotest_common.sh@10 -- # set +x 00:07:15.582 ************************************ 00:07:15.582 START TEST accel_decmop_full 00:07:15.582 ************************************ 00:07:15.582 16:43:00 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:15.582 16:43:00 -- accel/accel.sh@16 -- # local accel_opc 00:07:15.582 16:43:00 -- accel/accel.sh@17 -- # local accel_module 00:07:15.582 16:43:00 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:15.582 16:43:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:15.582 16:43:00 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.582 16:43:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:15.582 16:43:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.582 16:43:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.582 16:43:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:15.582 16:43:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:15.582 16:43:00 -- accel/accel.sh@41 -- # local IFS=, 00:07:15.582 16:43:01 -- accel/accel.sh@42 -- # jq -r . 00:07:15.582 [2024-11-16 16:43:01.016659] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:15.582 [2024-11-16 16:43:01.016847] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479560 ] 00:07:15.582 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.582 [2024-11-16 16:43:01.086079] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.582 [2024-11-16 16:43:01.122019] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.961 16:43:02 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:16.961 00:07:16.961 SPDK Configuration: 00:07:16.961 Core mask: 0x1 00:07:16.961 00:07:16.961 Accel Perf Configuration: 00:07:16.961 Workload Type: decompress 00:07:16.961 Transfer size: 111250 bytes 00:07:16.961 Vector count 1 00:07:16.961 Module: software 00:07:16.961 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.961 Queue depth: 32 00:07:16.961 Allocate depth: 32 00:07:16.961 # threads/core: 1 00:07:16.961 Run time: 1 seconds 00:07:16.961 Verify: Yes 00:07:16.961 00:07:16.961 Running for 1 seconds... 
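accel_decmop_full passes -o 0, and the configuration block above answers with Transfer size: 111250 bytes; a zero transfer size evidently switches accel_perf to full-sized chunks of the compressed test file instead of the 4096-byte default. The bandwidth math is unchanged: 5952/s * 111250 B / 2^20 ≈ 631 MiB/s, the Total row below.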
00:07:16.961 00:07:16.961 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:16.961 ------------------------------------------------------------------------------------ 00:07:16.961 0,0 5952/s 245 MiB/s 0 0 00:07:16.961 ==================================================================================== 00:07:16.961 Total 5952/s 631 MiB/s 0 0' 00:07:16.961 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.961 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.961 16:43:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:16.961 16:43:02 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.961 16:43:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.961 16:43:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.961 16:43:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:16.961 16:43:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.961 16:43:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.962 16:43:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.962 16:43:02 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.962 16:43:02 -- accel/accel.sh@42 -- # jq -r . 00:07:16.962 [2024-11-16 16:43:02.314631] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:16.962 [2024-11-16 16:43:02.314738] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid479834 ] 00:07:16.962 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.962 [2024-11-16 16:43:02.381346] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.962 [2024-11-16 16:43:02.415069] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val= 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val= 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val= 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val=0x1 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val= 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val= 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val=decompress 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val= 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val=software 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@23 -- # accel_module=software 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val=32 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val=32 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val=1 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val=Yes 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val= 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:16.962 16:43:02 -- accel/accel.sh@21 -- # val= 00:07:16.962 16:43:02 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # IFS=: 00:07:16.962 16:43:02 -- accel/accel.sh@20 -- # read -r var val 00:07:17.899 16:43:03 -- accel/accel.sh@21 -- # val= 00:07:17.899 16:43:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.899 16:43:03 -- accel/accel.sh@20 -- # IFS=: 00:07:17.899 16:43:03 -- accel/accel.sh@20 -- # read -r var val 00:07:17.899 16:43:03 -- accel/accel.sh@21 -- # val= 00:07:17.899 16:43:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.899 16:43:03 -- accel/accel.sh@20 -- # IFS=: 00:07:17.899 16:43:03 -- accel/accel.sh@20 -- # read -r var val 00:07:17.899 16:43:03 -- accel/accel.sh@21 -- # val= 00:07:17.899 16:43:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.899 16:43:03 
-- accel/accel.sh@20 -- # IFS=: 00:07:17.900 16:43:03 -- accel/accel.sh@20 -- # read -r var val 00:07:17.900 16:43:03 -- accel/accel.sh@21 -- # val= 00:07:17.900 16:43:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.900 16:43:03 -- accel/accel.sh@20 -- # IFS=: 00:07:17.900 16:43:03 -- accel/accel.sh@20 -- # read -r var val 00:07:17.900 16:43:03 -- accel/accel.sh@21 -- # val= 00:07:17.900 16:43:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.900 16:43:03 -- accel/accel.sh@20 -- # IFS=: 00:07:17.900 16:43:03 -- accel/accel.sh@20 -- # read -r var val 00:07:17.900 16:43:03 -- accel/accel.sh@21 -- # val= 00:07:17.900 16:43:03 -- accel/accel.sh@22 -- # case "$var" in 00:07:17.900 16:43:03 -- accel/accel.sh@20 -- # IFS=: 00:07:17.900 16:43:03 -- accel/accel.sh@20 -- # read -r var val 00:07:17.900 16:43:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:17.900 16:43:03 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:17.900 16:43:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.900 00:07:17.900 real 0m2.600s 00:07:17.900 user 0m2.348s 00:07:17.900 sys 0m0.258s 00:07:17.900 16:43:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:17.900 16:43:03 -- common/autotest_common.sh@10 -- # set +x 00:07:17.900 ************************************ 00:07:17.900 END TEST accel_decmop_full 00:07:17.900 ************************************ 00:07:17.900 16:43:03 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:17.900 16:43:03 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:17.900 16:43:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:17.900 16:43:03 -- common/autotest_common.sh@10 -- # set +x 00:07:17.900 ************************************ 00:07:17.900 START TEST accel_decomp_mcore 00:07:17.900 ************************************ 00:07:17.900 16:43:03 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:17.900 16:43:03 -- accel/accel.sh@16 -- # local accel_opc 00:07:17.900 16:43:03 -- accel/accel.sh@17 -- # local accel_module 00:07:17.900 16:43:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:17.900 16:43:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:17.900 16:43:03 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.900 16:43:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:17.900 16:43:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.900 16:43:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.900 16:43:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:17.900 16:43:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:17.900 16:43:03 -- accel/accel.sh@41 -- # local IFS=, 00:07:18.159 16:43:03 -- accel/accel.sh@42 -- # jq -r . 00:07:18.159 [2024-11-16 16:43:03.661397] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
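Every accel_perf run in this suite is a separate SPDK process: the repeated "Starting SPDK v24.01.1-pre ... initialization..." banner and the DPDK EAL parameter dump that follows it mark each fresh start, and the --file-prefix=spdk_pidNNNNNN argument visible in that dump keeps each child's hugepage files isolated from its siblings (the pid in the prefix matches the child the harness just launched).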
00:07:18.159 [2024-11-16 16:43:03.661475] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480115 ] 00:07:18.159 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.159 [2024-11-16 16:43:03.728877] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:18.159 [2024-11-16 16:43:03.766445] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:18.159 [2024-11-16 16:43:03.766538] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:18.159 [2024-11-16 16:43:03.766638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:18.159 [2024-11-16 16:43:03.766640] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.538 16:43:04 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:19.538 00:07:19.538 SPDK Configuration: 00:07:19.538 Core mask: 0xf 00:07:19.538 00:07:19.538 Accel Perf Configuration: 00:07:19.538 Workload Type: decompress 00:07:19.538 Transfer size: 4096 bytes 00:07:19.538 Vector count 1 00:07:19.538 Module: software 00:07:19.538 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.538 Queue depth: 32 00:07:19.538 Allocate depth: 32 00:07:19.538 # threads/core: 1 00:07:19.538 Run time: 1 seconds 00:07:19.538 Verify: Yes 00:07:19.538 00:07:19.538 Running for 1 seconds... 00:07:19.538 00:07:19.538 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:19.538 ------------------------------------------------------------------------------------ 00:07:19.538 0,0 78176/s 144 MiB/s 0 0 00:07:19.538 3,0 78688/s 145 MiB/s 0 0 00:07:19.538 2,0 78304/s 144 MiB/s 0 0 00:07:19.538 1,0 78464/s 144 MiB/s 0 0 00:07:19.538 ==================================================================================== 00:07:19.538 Total 313632/s 1225 MiB/s 0 0' 00:07:19.538 16:43:04 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:04 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:19.538 16:43:04 -- accel/accel.sh@12 -- # build_accel_config 00:07:19.538 16:43:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:19.538 16:43:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:19.538 16:43:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:19.538 16:43:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:19.538 16:43:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:19.538 16:43:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:19.538 16:43:04 -- accel/accel.sh@41 -- # local IFS=, 00:07:19.538 16:43:04 -- accel/accel.sh@42 -- # jq -r . 00:07:19.538 [2024-11-16 16:43:04.948906] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
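The harness command behind this test reduces to an accel_perf call whose flags appear verbatim in the trace above. A minimal hand-run sketch from the SPDK root, dropping the -c /dev/fd/62 JSON accel config the harness pipes in, would be:

    # verified (-y) 1-second decompress of the test bib file on cores 0-3 (-m 0xf);
    # 4096-byte transfers are the default
    ./build/examples/accel_perf -t 1 -w decompress -l ./test/accel/bib -y -m 0xf

The four Core,Thread rows in the table are the four reactors selected by the 0xf mask, and the Total row multiplies out: 313632 transfers/s x 4096 bytes is about 1225 MiB/s, exactly as reported.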
00:07:19.538 [2024-11-16 16:43:04.948955] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480385 ] 00:07:19.538 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.538 [2024-11-16 16:43:05.010802] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:19.538 [2024-11-16 16:43:05.047722] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:19.538 [2024-11-16 16:43:05.047816] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:19.538 [2024-11-16 16:43:05.047876] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:19.538 [2024-11-16 16:43:05.047878] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val= 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val= 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val= 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val=0xf 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val= 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val= 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val=decompress 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val= 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val=software 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@23 -- # accel_module=software 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.538 16:43:05 -- accel/accel.sh@21 -- # val=32 00:07:19.538 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.538 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.539 16:43:05 -- accel/accel.sh@21 -- # val=32 00:07:19.539 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.539 16:43:05 -- accel/accel.sh@21 -- # val=1 00:07:19.539 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.539 16:43:05 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:19.539 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.539 16:43:05 -- accel/accel.sh@21 -- # val=Yes 00:07:19.539 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.539 16:43:05 -- accel/accel.sh@21 -- # val= 00:07:19.539 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:19.539 16:43:05 -- accel/accel.sh@21 -- # val= 00:07:19.539 16:43:05 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # IFS=: 00:07:19.539 16:43:05 -- accel/accel.sh@20 -- # read -r var val 00:07:20.480 16:43:06 -- accel/accel.sh@21 -- # val= 00:07:20.480 16:43:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # IFS=: 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # read -r var val 00:07:20.480 16:43:06 -- accel/accel.sh@21 -- # val= 00:07:20.480 16:43:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # IFS=: 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # read -r var val 00:07:20.480 16:43:06 -- accel/accel.sh@21 -- # val= 00:07:20.480 16:43:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # IFS=: 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # read -r var val 00:07:20.480 16:43:06 -- accel/accel.sh@21 -- # val= 00:07:20.480 16:43:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # IFS=: 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # read -r var val 00:07:20.480 16:43:06 -- accel/accel.sh@21 -- # val= 00:07:20.480 16:43:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # IFS=: 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # read -r var val 00:07:20.480 16:43:06 -- accel/accel.sh@21 -- # val= 00:07:20.480 16:43:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # IFS=: 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # read -r var val 00:07:20.480 16:43:06 -- accel/accel.sh@21 -- # val= 00:07:20.480 16:43:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # IFS=: 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # read -r var val 00:07:20.480 16:43:06 -- accel/accel.sh@21 -- # val= 00:07:20.480 16:43:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.480 
16:43:06 -- accel/accel.sh@20 -- # IFS=: 00:07:20.480 16:43:06 -- accel/accel.sh@20 -- # read -r var val 00:07:20.740 16:43:06 -- accel/accel.sh@21 -- # val= 00:07:20.740 16:43:06 -- accel/accel.sh@22 -- # case "$var" in 00:07:20.740 16:43:06 -- accel/accel.sh@20 -- # IFS=: 00:07:20.740 16:43:06 -- accel/accel.sh@20 -- # read -r var val 00:07:20.740 16:43:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:20.740 16:43:06 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:20.740 16:43:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.740 00:07:20.740 real 0m2.588s 00:07:20.740 user 0m8.992s 00:07:20.740 sys 0m0.260s 00:07:20.740 16:43:06 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:20.740 16:43:06 -- common/autotest_common.sh@10 -- # set +x 00:07:20.740 ************************************ 00:07:20.740 END TEST accel_decomp_mcore 00:07:20.740 ************************************ 00:07:20.740 16:43:06 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:20.740 16:43:06 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:20.740 16:43:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:20.740 16:43:06 -- common/autotest_common.sh@10 -- # set +x 00:07:20.740 ************************************ 00:07:20.740 START TEST accel_decomp_full_mcore 00:07:20.740 ************************************ 00:07:20.740 16:43:06 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:20.740 16:43:06 -- accel/accel.sh@16 -- # local accel_opc 00:07:20.740 16:43:06 -- accel/accel.sh@17 -- # local accel_module 00:07:20.740 16:43:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:20.740 16:43:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:20.740 16:43:06 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.740 16:43:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:20.740 16:43:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.740 16:43:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.740 16:43:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:20.740 16:43:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:20.740 16:43:06 -- accel/accel.sh@41 -- # local IFS=, 00:07:20.740 16:43:06 -- accel/accel.sh@42 -- # jq -r . 00:07:20.740 [2024-11-16 16:43:06.291695] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
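Two details at this boundary: the finished accel_decomp_mcore run reports user CPU time (0m8.992s) at roughly 3.5x wall time (0m2.588s), consistent with four reactors decompressing in parallel; and the accel_decomp_full_mcore variant starting here adds -o 0 to the otherwise identical command line, which in this harness appears to select the full 111250-byte transfer size reported in the configuration block that follows.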
00:07:20.740 [2024-11-16 16:43:06.291781] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480591 ] 00:07:20.740 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.740 [2024-11-16 16:43:06.360087] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:20.740 [2024-11-16 16:43:06.397666] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:20.740 [2024-11-16 16:43:06.397765] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:20.741 [2024-11-16 16:43:06.397839] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:20.741 [2024-11-16 16:43:06.397841] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.120 16:43:07 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:22.120 00:07:22.120 SPDK Configuration: 00:07:22.120 Core mask: 0xf 00:07:22.120 00:07:22.120 Accel Perf Configuration: 00:07:22.120 Workload Type: decompress 00:07:22.120 Transfer size: 111250 bytes 00:07:22.120 Vector count 1 00:07:22.120 Module: software 00:07:22.120 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:22.120 Queue depth: 32 00:07:22.120 Allocate depth: 32 00:07:22.120 # threads/core: 1 00:07:22.120 Run time: 1 seconds 00:07:22.120 Verify: Yes 00:07:22.120 00:07:22.120 Running for 1 seconds... 00:07:22.120 00:07:22.120 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:22.120 ------------------------------------------------------------------------------------ 00:07:22.120 0,0 5792/s 239 MiB/s 0 0 00:07:22.120 3,0 5824/s 240 MiB/s 0 0 00:07:22.120 2,0 5824/s 240 MiB/s 0 0 00:07:22.120 1,0 5824/s 240 MiB/s 0 0 00:07:22.120 ==================================================================================== 00:07:22.120 Total 23264/s 2468 MiB/s 0 0' 00:07:22.120 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.120 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.120 16:43:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.120 16:43:07 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.120 16:43:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:22.120 16:43:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:22.120 16:43:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.120 16:43:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.120 16:43:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:22.120 16:43:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:22.120 16:43:07 -- accel/accel.sh@41 -- # local IFS=, 00:07:22.120 16:43:07 -- accel/accel.sh@42 -- # jq -r . 00:07:22.120 [2024-11-16 16:43:07.598039] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
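The Total row again multiplies out, now with the large transfers; a quick shell check in integer MiB reproduces the reported figure:

    echo $(( 23264 * 111250 / 1024 / 1024 ))   # prints 2468, matching 'Total 23264/s 2468 MiB/s'

Per-core transfer rates drop (about 5800/s versus ~78000/s for 4 KiB), but aggregate bandwidth roughly doubles from 1225 to 2468 MiB/s, as one would expect when fixed per-operation overhead is amortized over transfers some 27x larger.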
00:07:22.120 [2024-11-16 16:43:07.598126] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480747 ] 00:07:22.120 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.120 [2024-11-16 16:43:07.665020] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:22.120 [2024-11-16 16:43:07.701857] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.120 [2024-11-16 16:43:07.701952] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:22.120 [2024-11-16 16:43:07.702039] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:22.120 [2024-11-16 16:43:07.702041] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.120 16:43:07 -- accel/accel.sh@21 -- # val= 00:07:22.120 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.120 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.120 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.120 16:43:07 -- accel/accel.sh@21 -- # val= 00:07:22.120 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.120 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.120 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.120 16:43:07 -- accel/accel.sh@21 -- # val= 00:07:22.120 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.120 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.120 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val=0xf 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val= 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val= 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val=decompress 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val= 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val=software 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@23 -- # accel_module=software 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val=32 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val=32 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val=1 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val=Yes 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val= 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:22.121 16:43:07 -- accel/accel.sh@21 -- # val= 00:07:22.121 16:43:07 -- accel/accel.sh@22 -- # case "$var" in 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # IFS=: 00:07:22.121 16:43:07 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@21 -- # val= 00:07:23.500 16:43:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # IFS=: 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@21 -- # val= 00:07:23.500 16:43:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # IFS=: 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@21 -- # val= 00:07:23.500 16:43:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # IFS=: 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@21 -- # val= 00:07:23.500 16:43:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # IFS=: 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@21 -- # val= 00:07:23.500 16:43:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # IFS=: 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@21 -- # val= 00:07:23.500 16:43:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # IFS=: 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@21 -- # val= 00:07:23.500 16:43:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # IFS=: 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@21 -- # val= 00:07:23.500 16:43:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.500 
16:43:08 -- accel/accel.sh@20 -- # IFS=: 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@21 -- # val= 00:07:23.500 16:43:08 -- accel/accel.sh@22 -- # case "$var" in 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # IFS=: 00:07:23.500 16:43:08 -- accel/accel.sh@20 -- # read -r var val 00:07:23.500 16:43:08 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:23.500 16:43:08 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:23.500 16:43:08 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:23.500 00:07:23.500 real 0m2.619s 00:07:23.500 user 0m9.051s 00:07:23.500 sys 0m0.276s 00:07:23.500 16:43:08 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:23.500 16:43:08 -- common/autotest_common.sh@10 -- # set +x 00:07:23.500 ************************************ 00:07:23.500 END TEST accel_decomp_full_mcore 00:07:23.500 ************************************ 00:07:23.500 16:43:08 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.500 16:43:08 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:23.500 16:43:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:23.500 16:43:08 -- common/autotest_common.sh@10 -- # set +x 00:07:23.500 ************************************ 00:07:23.500 START TEST accel_decomp_mthread 00:07:23.500 ************************************ 00:07:23.500 16:43:08 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.500 16:43:08 -- accel/accel.sh@16 -- # local accel_opc 00:07:23.500 16:43:08 -- accel/accel.sh@17 -- # local accel_module 00:07:23.500 16:43:08 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.500 16:43:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:23.500 16:43:08 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.501 16:43:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:23.501 16:43:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.501 16:43:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.501 16:43:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:23.501 16:43:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:23.501 16:43:08 -- accel/accel.sh@41 -- # local IFS=, 00:07:23.501 16:43:08 -- accel/accel.sh@42 -- # jq -r . 00:07:23.501 [2024-11-16 16:43:08.939012] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:23.501 [2024-11-16 16:43:08.939060] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid480990 ] 00:07:23.501 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.501 [2024-11-16 16:43:09.001468] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.501 [2024-11-16 16:43:09.036240] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.880 16:43:10 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:24.880 00:07:24.880 SPDK Configuration: 00:07:24.880 Core mask: 0x1 00:07:24.880 00:07:24.880 Accel Perf Configuration: 00:07:24.880 Workload Type: decompress 00:07:24.880 Transfer size: 4096 bytes 00:07:24.880 Vector count 1 00:07:24.880 Module: software 00:07:24.880 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.880 Queue depth: 32 00:07:24.880 Allocate depth: 32 00:07:24.880 # threads/core: 2 00:07:24.880 Run time: 1 seconds 00:07:24.880 Verify: Yes 00:07:24.880 00:07:24.880 Running for 1 seconds... 00:07:24.880 00:07:24.880 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:24.880 ------------------------------------------------------------------------------------ 00:07:24.880 0,1 47552/s 87 MiB/s 0 0 00:07:24.880 0,0 47392/s 87 MiB/s 0 0 00:07:24.880 ==================================================================================== 00:07:24.880 Total 94944/s 370 MiB/s 0 0' 00:07:24.880 16:43:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@12 -- # build_accel_config 00:07:24.880 16:43:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:24.880 16:43:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:24.880 16:43:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:24.880 16:43:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:24.880 16:43:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:24.880 16:43:10 -- accel/accel.sh@41 -- # local IFS=, 00:07:24.880 16:43:10 -- accel/accel.sh@42 -- # jq -r . 00:07:24.880 [2024-11-16 16:43:10.214885] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
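The accel_decomp_mthread test whose first run just printed its table keeps the default 0x1 core mask and instead asks for two worker threads per core via -T 2; a minimal hand-run sketch, again without the harness's piped-in JSON config, would be:

    # one core, two threads per core, verified 4096-byte decompress for 1 second
    ./build/examples/accel_perf -t 1 -w decompress -l ./test/accel/bib -y -T 2

That flag is what produces the '# threads/core: 2' line and the 0,0 / 0,1 rows above: the two threads on core 0 deliver 94944 transfers/s combined (370 MiB/s), about 21% more than the ~78000/s a single reactor managed per core in the mcore run, so the second thread helps but comes nowhere near doubling one core's throughput.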
00:07:24.880 [2024-11-16 16:43:10.214933] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid481256 ] 00:07:24.880 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.880 [2024-11-16 16:43:10.272743] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.880 [2024-11-16 16:43:10.308017] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val= 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val= 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val= 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val=0x1 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val= 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val= 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val=decompress 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val= 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val=software 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@23 -- # accel_module=software 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val=32 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 
-- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val=32 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val=2 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val=Yes 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val= 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:24.880 16:43:10 -- accel/accel.sh@21 -- # val= 00:07:24.880 16:43:10 -- accel/accel.sh@22 -- # case "$var" in 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # IFS=: 00:07:24.880 16:43:10 -- accel/accel.sh@20 -- # read -r var val 00:07:25.818 16:43:11 -- accel/accel.sh@21 -- # val= 00:07:25.818 16:43:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.818 16:43:11 -- accel/accel.sh@21 -- # val= 00:07:25.818 16:43:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.818 16:43:11 -- accel/accel.sh@21 -- # val= 00:07:25.818 16:43:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.818 16:43:11 -- accel/accel.sh@21 -- # val= 00:07:25.818 16:43:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.818 16:43:11 -- accel/accel.sh@21 -- # val= 00:07:25.818 16:43:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.818 16:43:11 -- accel/accel.sh@21 -- # val= 00:07:25.818 16:43:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.818 16:43:11 -- accel/accel.sh@21 -- # val= 00:07:25.818 16:43:11 -- accel/accel.sh@22 -- # case "$var" in 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # IFS=: 00:07:25.818 16:43:11 -- accel/accel.sh@20 -- # read -r var val 00:07:25.818 16:43:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:25.818 16:43:11 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:25.818 16:43:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:25.818 00:07:25.818 real 0m2.552s 00:07:25.818 user 0m2.323s 00:07:25.818 sys 0m0.238s 00:07:25.818 16:43:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:25.818 16:43:11 -- common/autotest_common.sh@10 -- # set +x 
00:07:25.818 ************************************ 00:07:25.818 END TEST accel_decomp_mthread 00:07:25.818 ************************************ 00:07:25.818 16:43:11 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.818 16:43:11 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:25.818 16:43:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:25.818 16:43:11 -- common/autotest_common.sh@10 -- # set +x 00:07:25.818 ************************************ 00:07:25.818 START TEST accel_deomp_full_mthread 00:07:25.818 ************************************ 00:07:25.818 16:43:11 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.818 16:43:11 -- accel/accel.sh@16 -- # local accel_opc 00:07:25.818 16:43:11 -- accel/accel.sh@17 -- # local accel_module 00:07:25.818 16:43:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.818 16:43:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:25.818 16:43:11 -- accel/accel.sh@12 -- # build_accel_config 00:07:25.818 16:43:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:25.818 16:43:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.818 16:43:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.818 16:43:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:25.818 16:43:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:25.818 16:43:11 -- accel/accel.sh@41 -- # local IFS=, 00:07:25.818 16:43:11 -- accel/accel.sh@42 -- # jq -r . 00:07:25.818 [2024-11-16 16:43:11.544862] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:25.818 [2024-11-16 16:43:11.544971] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid481545 ] 00:07:26.078 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.078 [2024-11-16 16:43:11.613704] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.078 [2024-11-16 16:43:11.649075] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.456 16:43:12 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:27.456 00:07:27.456 SPDK Configuration: 00:07:27.456 Core mask: 0x1 00:07:27.456 00:07:27.456 Accel Perf Configuration: 00:07:27.456 Workload Type: decompress 00:07:27.456 Transfer size: 111250 bytes 00:07:27.456 Vector count 1 00:07:27.456 Module: software 00:07:27.456 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:27.456 Queue depth: 32 00:07:27.456 Allocate depth: 32 00:07:27.456 # threads/core: 2 00:07:27.456 Run time: 1 seconds 00:07:27.456 Verify: Yes 00:07:27.456 00:07:27.456 Running for 1 seconds... 
00:07:27.456 00:07:27.456 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:27.456 ------------------------------------------------------------------------------------ 00:07:27.456 0,1 2944/s 121 MiB/s 0 0 00:07:27.456 0,0 2944/s 121 MiB/s 0 0 00:07:27.456 ==================================================================================== 00:07:27.456 Total 5888/s 624 MiB/s 0 0' 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:12 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:27.456 16:43:12 -- accel/accel.sh@12 -- # build_accel_config 00:07:27.456 16:43:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:27.456 16:43:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:27.456 16:43:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:27.456 16:43:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:27.456 16:43:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:27.456 16:43:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:27.456 16:43:12 -- accel/accel.sh@41 -- # local IFS=, 00:07:27.456 16:43:12 -- accel/accel.sh@42 -- # jq -r . 00:07:27.456 [2024-11-16 16:43:12.850810] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:27.456 [2024-11-16 16:43:12.850901] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid481811 ] 00:07:27.456 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.456 [2024-11-16 16:43:12.917989] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.456 [2024-11-16 16:43:12.951703] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.456 16:43:12 -- accel/accel.sh@21 -- # val= 00:07:27.456 16:43:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:12 -- accel/accel.sh@21 -- # val= 00:07:27.456 16:43:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:12 -- accel/accel.sh@21 -- # val= 00:07:27.456 16:43:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:12 -- accel/accel.sh@21 -- # val=0x1 00:07:27.456 16:43:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:12 -- accel/accel.sh@21 -- # val= 00:07:27.456 16:43:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:12 -- accel/accel.sh@21 -- # val= 00:07:27.456 16:43:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:12 -- accel/accel.sh@21 -- # val=decompress 
00:07:27.456 16:43:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:12 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:12 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:27.456 16:43:12 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:12 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:12 -- accel/accel.sh@21 -- # val= 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:13 -- accel/accel.sh@21 -- # val=software 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@23 -- # accel_module=software 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:13 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:13 -- accel/accel.sh@21 -- # val=32 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:13 -- accel/accel.sh@21 -- # val=32 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:13 -- accel/accel.sh@21 -- # val=2 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:13 -- accel/accel.sh@21 -- # val=Yes 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:13 -- accel/accel.sh@21 -- # val= 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:27.456 16:43:13 -- accel/accel.sh@21 -- # val= 00:07:27.456 16:43:13 -- accel/accel.sh@22 -- # case "$var" in 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # IFS=: 00:07:27.456 16:43:13 -- accel/accel.sh@20 -- # read -r var val 00:07:28.393 16:43:14 -- accel/accel.sh@21 -- # val= 00:07:28.393 16:43:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # IFS=: 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # read -r var val 00:07:28.393 16:43:14 -- accel/accel.sh@21 -- # val= 00:07:28.393 16:43:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # IFS=: 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # read -r var val 00:07:28.393 16:43:14 -- accel/accel.sh@21 -- # val= 00:07:28.393 16:43:14 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # IFS=: 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # read -r var val 00:07:28.393 16:43:14 -- accel/accel.sh@21 -- # val= 00:07:28.393 16:43:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # IFS=: 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # read -r var val 00:07:28.393 16:43:14 -- accel/accel.sh@21 -- # val= 00:07:28.393 16:43:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # IFS=: 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # read -r var val 00:07:28.393 16:43:14 -- accel/accel.sh@21 -- # val= 00:07:28.393 16:43:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # IFS=: 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # read -r var val 00:07:28.393 16:43:14 -- accel/accel.sh@21 -- # val= 00:07:28.393 16:43:14 -- accel/accel.sh@22 -- # case "$var" in 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # IFS=: 00:07:28.393 16:43:14 -- accel/accel.sh@20 -- # read -r var val 00:07:28.393 16:43:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:28.653 16:43:14 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:28.653 16:43:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:28.653 00:07:28.653 real 0m2.619s 00:07:28.653 user 0m2.367s 00:07:28.653 sys 0m0.259s 00:07:28.653 16:43:14 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:28.653 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:28.653 ************************************ 00:07:28.653 END TEST accel_deomp_full_mthread 00:07:28.653 ************************************ 00:07:28.653 16:43:14 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:28.653 16:43:14 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:28.653 16:43:14 -- accel/accel.sh@129 -- # build_accel_config 00:07:28.653 16:43:14 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:28.653 16:43:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:28.653 16:43:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:28.653 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:28.653 16:43:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.653 16:43:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.653 16:43:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:28.653 16:43:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:28.653 16:43:14 -- accel/accel.sh@41 -- # local IFS=, 00:07:28.653 16:43:14 -- accel/accel.sh@42 -- # jq -r . 00:07:28.653 ************************************ 00:07:28.653 START TEST accel_dif_functional_tests 00:07:28.653 ************************************ 00:07:28.653 16:43:14 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:28.653 [2024-11-16 16:43:14.211581] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
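The accel_dif_functional_tests suite starting here exercises DIF (the T10 Data Integrity Field: a per-block guard CRC, application tag, and reference tag) through a small CUnit harness. The dif.c *ERROR* lines interleaved with the results below are expected output rather than failures: the 'verify: DIF not generated' and 'APPTAG/REFTAG incorrect' cases deliberately present mismatching tags so that verification must fail, and each such test is marked passed precisely because the expected comparison error was reported.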
00:07:28.653 [2024-11-16 16:43:14.211660] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid482093 ] 00:07:28.653 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.653 [2024-11-16 16:43:14.278272] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:28.653 [2024-11-16 16:43:14.315115] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:28.653 [2024-11-16 16:43:14.315212] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:28.653 [2024-11-16 16:43:14.315214] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.653 00:07:28.653 00:07:28.653 CUnit - A unit testing framework for C - Version 2.1-3 00:07:28.653 http://cunit.sourceforge.net/ 00:07:28.653 00:07:28.653 00:07:28.653 Suite: accel_dif 00:07:28.653 Test: verify: DIF generated, GUARD check ...passed 00:07:28.653 Test: verify: DIF generated, APPTAG check ...passed 00:07:28.653 Test: verify: DIF generated, REFTAG check ...passed 00:07:28.653 Test: verify: DIF not generated, GUARD check ...[2024-11-16 16:43:14.378095] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:28.653 [2024-11-16 16:43:14.378146] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:28.653 passed 00:07:28.653 Test: verify: DIF not generated, APPTAG check ...[2024-11-16 16:43:14.378194] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:28.653 [2024-11-16 16:43:14.378214] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:28.653 passed 00:07:28.653 Test: verify: DIF not generated, REFTAG check ...[2024-11-16 16:43:14.378234] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:28.653 [2024-11-16 16:43:14.378253] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:28.653 passed 00:07:28.653 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:28.653 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-16 16:43:14.378299] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:28.653 passed 00:07:28.653 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:28.653 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:28.653 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:28.653 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-16 16:43:14.378405] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:28.653 passed 00:07:28.653 Test: generate copy: DIF generated, GUARD check ...passed 00:07:28.653 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:28.653 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:28.653 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:28.653 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:28.653 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:28.653 Test: generate copy: iovecs-len validate ...[2024-11-16 16:43:14.378591] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:28.653 passed 00:07:28.653 Test: generate copy: buffer alignment validate ...passed 00:07:28.653 00:07:28.653 Run Summary: Type Total Ran Passed Failed Inactive 00:07:28.653 suites 1 1 n/a 0 0 00:07:28.653 tests 20 20 20 0 0 00:07:28.653 asserts 204 204 204 0 n/a 00:07:28.653 00:07:28.653 Elapsed time = 0.000 seconds 00:07:28.913 00:07:28.913 real 0m0.343s 00:07:28.913 user 0m0.536s 00:07:28.913 sys 0m0.149s 00:07:28.913 16:43:14 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:28.913 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:28.913 ************************************ 00:07:28.913 END TEST accel_dif_functional_tests 00:07:28.913 ************************************ 00:07:28.913 00:07:28.913 real 0m54.879s 00:07:28.913 user 1m2.772s 00:07:28.913 sys 0m6.765s 00:07:28.913 16:43:14 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:28.913 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:28.913 ************************************ 00:07:28.913 END TEST accel 00:07:28.913 ************************************ 00:07:28.913 16:43:14 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:28.913 16:43:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:28.913 16:43:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:28.913 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:28.913 ************************************ 00:07:28.913 START TEST accel_rpc 00:07:28.913 ************************************ 00:07:28.913 16:43:14 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:29.173 * Looking for test storage... 00:07:29.173 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:29.173 16:43:14 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:29.173 16:43:14 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:29.173 16:43:14 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:29.173 16:43:14 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:29.173 16:43:14 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:29.173 16:43:14 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:29.173 16:43:14 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:29.173 16:43:14 -- scripts/common.sh@335 -- # IFS=.-: 00:07:29.173 16:43:14 -- scripts/common.sh@335 -- # read -ra ver1 00:07:29.173 16:43:14 -- scripts/common.sh@336 -- # IFS=.-: 00:07:29.173 16:43:14 -- scripts/common.sh@336 -- # read -ra ver2 00:07:29.173 16:43:14 -- scripts/common.sh@337 -- # local 'op=<' 00:07:29.173 16:43:14 -- scripts/common.sh@339 -- # ver1_l=2 00:07:29.173 16:43:14 -- scripts/common.sh@340 -- # ver2_l=1 00:07:29.173 16:43:14 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:29.173 16:43:14 -- scripts/common.sh@343 -- # case "$op" in 00:07:29.173 16:43:14 -- scripts/common.sh@344 -- # : 1 00:07:29.173 16:43:14 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:29.173 16:43:14 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:29.173 16:43:14 -- scripts/common.sh@364 -- # decimal 1 00:07:29.173 16:43:14 -- scripts/common.sh@352 -- # local d=1 00:07:29.173 16:43:14 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:29.173 16:43:14 -- scripts/common.sh@354 -- # echo 1 00:07:29.173 16:43:14 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:29.173 16:43:14 -- scripts/common.sh@365 -- # decimal 2 00:07:29.173 16:43:14 -- scripts/common.sh@352 -- # local d=2 00:07:29.173 16:43:14 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:29.173 16:43:14 -- scripts/common.sh@354 -- # echo 2 00:07:29.173 16:43:14 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:29.173 16:43:14 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:29.173 16:43:14 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:29.173 16:43:14 -- scripts/common.sh@367 -- # return 0 00:07:29.173 16:43:14 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:29.173 16:43:14 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:29.173 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.173 --rc genhtml_branch_coverage=1 00:07:29.173 --rc genhtml_function_coverage=1 00:07:29.173 --rc genhtml_legend=1 00:07:29.173 --rc geninfo_all_blocks=1 00:07:29.173 --rc geninfo_unexecuted_blocks=1 00:07:29.173 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.173 ' 00:07:29.173 16:43:14 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:29.173 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.173 --rc genhtml_branch_coverage=1 00:07:29.173 --rc genhtml_function_coverage=1 00:07:29.173 --rc genhtml_legend=1 00:07:29.173 --rc geninfo_all_blocks=1 00:07:29.173 --rc geninfo_unexecuted_blocks=1 00:07:29.173 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.173 ' 00:07:29.173 16:43:14 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:29.173 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.173 --rc genhtml_branch_coverage=1 00:07:29.173 --rc genhtml_function_coverage=1 00:07:29.173 --rc genhtml_legend=1 00:07:29.173 --rc geninfo_all_blocks=1 00:07:29.173 --rc geninfo_unexecuted_blocks=1 00:07:29.173 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.173 ' 00:07:29.173 16:43:14 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:29.173 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:29.173 --rc genhtml_branch_coverage=1 00:07:29.173 --rc genhtml_function_coverage=1 00:07:29.173 --rc genhtml_legend=1 00:07:29.173 --rc geninfo_all_blocks=1 00:07:29.173 --rc geninfo_unexecuted_blocks=1 00:07:29.173 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:29.173 ' 00:07:29.173 16:43:14 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:29.173 16:43:14 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=482176 00:07:29.173 16:43:14 -- accel/accel_rpc.sh@15 -- # waitforlisten 482176 00:07:29.173 16:43:14 -- common/autotest_common.sh@829 -- # '[' -z 482176 ']' 00:07:29.173 16:43:14 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:29.173 16:43:14 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:29.173 16:43:14 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk.sock...' 00:07:29.173 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:29.173 16:43:14 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:29.173 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:29.173 16:43:14 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:29.173 [2024-11-16 16:43:14.820814] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:29.173 [2024-11-16 16:43:14.820891] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid482176 ] 00:07:29.173 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.173 [2024-11-16 16:43:14.886716] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.432 [2024-11-16 16:43:14.924099] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:29.432 [2024-11-16 16:43:14.924210] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.432 16:43:14 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:29.432 16:43:14 -- common/autotest_common.sh@862 -- # return 0 00:07:29.432 16:43:14 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:29.432 16:43:14 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:29.432 16:43:14 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:29.432 16:43:14 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:29.432 16:43:14 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:29.432 16:43:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:29.432 16:43:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:29.432 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:29.432 ************************************ 00:07:29.432 START TEST accel_assign_opcode 00:07:29.432 ************************************ 00:07:29.432 16:43:14 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:07:29.432 16:43:14 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:29.432 16:43:14 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:29.432 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:29.432 [2024-11-16 16:43:14.984660] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:29.432 16:43:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:29.432 16:43:14 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:29.432 16:43:14 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:29.432 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:29.432 [2024-11-16 16:43:14.992678] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:29.432 16:43:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:29.432 16:43:14 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:29.432 16:43:14 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:29.432 16:43:14 -- common/autotest_common.sh@10 -- # set +x 00:07:29.432 16:43:15 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:29.432 16:43:15 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:29.432 16:43:15 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:29.432 
16:43:15 -- accel/accel_rpc.sh@42 -- # grep software 00:07:29.432 16:43:15 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:29.432 16:43:15 -- common/autotest_common.sh@10 -- # set +x 00:07:29.432 16:43:15 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:29.691 software 00:07:29.691 00:07:29.691 real 0m0.206s 00:07:29.691 user 0m0.031s 00:07:29.691 sys 0m0.014s 00:07:29.691 16:43:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:29.691 16:43:15 -- common/autotest_common.sh@10 -- # set +x 00:07:29.691 ************************************ 00:07:29.691 END TEST accel_assign_opcode 00:07:29.691 ************************************ 00:07:29.691 16:43:15 -- accel/accel_rpc.sh@55 -- # killprocess 482176 00:07:29.691 16:43:15 -- common/autotest_common.sh@936 -- # '[' -z 482176 ']' 00:07:29.691 16:43:15 -- common/autotest_common.sh@940 -- # kill -0 482176 00:07:29.691 16:43:15 -- common/autotest_common.sh@941 -- # uname 00:07:29.691 16:43:15 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:29.691 16:43:15 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 482176 00:07:29.691 16:43:15 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:29.691 16:43:15 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:29.691 16:43:15 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 482176' 00:07:29.691 killing process with pid 482176 00:07:29.691 16:43:15 -- common/autotest_common.sh@955 -- # kill 482176 00:07:29.691 16:43:15 -- common/autotest_common.sh@960 -- # wait 482176 00:07:29.951 00:07:29.951 real 0m0.965s 00:07:29.951 user 0m0.838s 00:07:29.951 sys 0m0.461s 00:07:29.951 16:43:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:29.951 16:43:15 -- common/autotest_common.sh@10 -- # set +x 00:07:29.951 ************************************ 00:07:29.951 END TEST accel_rpc 00:07:29.951 ************************************ 00:07:29.951 16:43:15 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:29.951 16:43:15 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:29.951 16:43:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:29.951 16:43:15 -- common/autotest_common.sh@10 -- # set +x 00:07:29.951 ************************************ 00:07:29.951 START TEST app_cmdline 00:07:29.951 ************************************ 00:07:29.951 16:43:15 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:30.210 * Looking for test storage... 
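The accel_rpc suite that just finished pins the copy opcode to the software module over JSON-RPC before the framework initializes. A minimal sketch of replaying those calls by hand against a spdk_tgt started with --wait-for-rpc (the rpc.py methods and flags are the same ones the trace shows; the install path is this workspace's, and the default socket /var/tmp/spdk.sock is assumed):

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
# Pin the copy opcode to the software module; this only takes effect
# once the framework is initialized.
"$SPDK/scripts/rpc.py" accel_assign_opc -o copy -m software
"$SPDK/scripts/rpc.py" framework_start_init
# Read the assignment back; the test greps for the literal "software".
"$SPDK/scripts/rpc.py" accel_get_opc_assignments | jq -r .copy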
00:07:30.210 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:30.210 16:43:15 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:30.210 16:43:15 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:30.210 16:43:15 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:30.210 16:43:15 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:30.210 16:43:15 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:30.210 16:43:15 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:30.210 16:43:15 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:30.210 16:43:15 -- scripts/common.sh@335 -- # IFS=.-: 00:07:30.210 16:43:15 -- scripts/common.sh@335 -- # read -ra ver1 00:07:30.210 16:43:15 -- scripts/common.sh@336 -- # IFS=.-: 00:07:30.210 16:43:15 -- scripts/common.sh@336 -- # read -ra ver2 00:07:30.210 16:43:15 -- scripts/common.sh@337 -- # local 'op=<' 00:07:30.210 16:43:15 -- scripts/common.sh@339 -- # ver1_l=2 00:07:30.210 16:43:15 -- scripts/common.sh@340 -- # ver2_l=1 00:07:30.210 16:43:15 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:30.210 16:43:15 -- scripts/common.sh@343 -- # case "$op" in 00:07:30.210 16:43:15 -- scripts/common.sh@344 -- # : 1 00:07:30.210 16:43:15 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:30.210 16:43:15 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:30.210 16:43:15 -- scripts/common.sh@364 -- # decimal 1 00:07:30.210 16:43:15 -- scripts/common.sh@352 -- # local d=1 00:07:30.210 16:43:15 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:30.210 16:43:15 -- scripts/common.sh@354 -- # echo 1 00:07:30.210 16:43:15 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:30.210 16:43:15 -- scripts/common.sh@365 -- # decimal 2 00:07:30.210 16:43:15 -- scripts/common.sh@352 -- # local d=2 00:07:30.210 16:43:15 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:30.210 16:43:15 -- scripts/common.sh@354 -- # echo 2 00:07:30.210 16:43:15 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:30.210 16:43:15 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:30.210 16:43:15 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:30.210 16:43:15 -- scripts/common.sh@367 -- # return 0 00:07:30.210 16:43:15 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:30.210 16:43:15 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:30.210 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.210 --rc genhtml_branch_coverage=1 00:07:30.210 --rc genhtml_function_coverage=1 00:07:30.210 --rc genhtml_legend=1 00:07:30.210 --rc geninfo_all_blocks=1 00:07:30.210 --rc geninfo_unexecuted_blocks=1 00:07:30.210 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.210 ' 00:07:30.210 16:43:15 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:30.210 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.210 --rc genhtml_branch_coverage=1 00:07:30.210 --rc genhtml_function_coverage=1 00:07:30.210 --rc genhtml_legend=1 00:07:30.210 --rc geninfo_all_blocks=1 00:07:30.210 --rc geninfo_unexecuted_blocks=1 00:07:30.210 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.210 ' 00:07:30.210 16:43:15 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:30.210 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.210 --rc genhtml_branch_coverage=1 00:07:30.210 
--rc genhtml_function_coverage=1 00:07:30.210 --rc genhtml_legend=1 00:07:30.210 --rc geninfo_all_blocks=1 00:07:30.210 --rc geninfo_unexecuted_blocks=1 00:07:30.210 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.210 ' 00:07:30.210 16:43:15 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:30.210 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:30.210 --rc genhtml_branch_coverage=1 00:07:30.210 --rc genhtml_function_coverage=1 00:07:30.210 --rc genhtml_legend=1 00:07:30.210 --rc geninfo_all_blocks=1 00:07:30.210 --rc geninfo_unexecuted_blocks=1 00:07:30.210 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:30.210 ' 00:07:30.210 16:43:15 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:30.210 16:43:15 -- app/cmdline.sh@17 -- # spdk_tgt_pid=482512 00:07:30.210 16:43:15 -- app/cmdline.sh@18 -- # waitforlisten 482512 00:07:30.210 16:43:15 -- common/autotest_common.sh@829 -- # '[' -z 482512 ']' 00:07:30.210 16:43:15 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:30.210 16:43:15 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:30.210 16:43:15 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:30.210 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:30.210 16:43:15 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:30.211 16:43:15 -- common/autotest_common.sh@10 -- # set +x 00:07:30.211 16:43:15 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:30.211 [2024-11-16 16:43:15.836256] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
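spdk_tgt is launched here with --rpcs-allowed spdk_get_version,rpc_get_methods, so only those two JSON-RPC methods should answer; the traces that follow show spdk_get_version returning the version object while env_dpdk_get_mem_stats is refused with error -32601. A rough sketch of probing such an allowlist by hand (rpc.py's -s flag selects the socket; paths are this workspace's and illustrative):

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
# Allowed method: prints the version object seen in the trace below.
"$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock spdk_get_version
# Enumerate exactly what the allowlist exposes.
"$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock rpc_get_methods
# Anything outside the allowlist comes back as JSON-RPC error -32601.
"$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock env_dpdk_get_mem_stats \
    || echo 'rejected: Method not found'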
00:07:30.211 [2024-11-16 16:43:15.836345] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid482512 ] 00:07:30.211 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.211 [2024-11-16 16:43:15.903685] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.211 [2024-11-16 16:43:15.941093] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:30.211 [2024-11-16 16:43:15.941209] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.148 16:43:16 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:31.148 16:43:16 -- common/autotest_common.sh@862 -- # return 0 00:07:31.148 16:43:16 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:31.148 { 00:07:31.148 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:07:31.148 "fields": { 00:07:31.148 "major": 24, 00:07:31.148 "minor": 1, 00:07:31.148 "patch": 1, 00:07:31.148 "suffix": "-pre", 00:07:31.148 "commit": "c13c99a5e" 00:07:31.148 } 00:07:31.148 } 00:07:31.148 16:43:16 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:31.148 16:43:16 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:31.148 16:43:16 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:31.148 16:43:16 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:31.148 16:43:16 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:31.148 16:43:16 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:31.148 16:43:16 -- common/autotest_common.sh@10 -- # set +x 00:07:31.148 16:43:16 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:31.148 16:43:16 -- app/cmdline.sh@26 -- # sort 00:07:31.148 16:43:16 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:31.148 16:43:16 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:31.148 16:43:16 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:31.148 16:43:16 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:31.148 16:43:16 -- common/autotest_common.sh@650 -- # local es=0 00:07:31.148 16:43:16 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:31.148 16:43:16 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.148 16:43:16 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:31.148 16:43:16 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.148 16:43:16 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:31.148 16:43:16 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.148 16:43:16 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:31.148 16:43:16 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:31.148 16:43:16 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:31.148 16:43:16 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:31.408 request: 00:07:31.408 { 00:07:31.408 "method": "env_dpdk_get_mem_stats", 00:07:31.408 "req_id": 1 00:07:31.408 } 00:07:31.408 Got JSON-RPC error response 00:07:31.408 response: 00:07:31.408 { 00:07:31.408 "code": -32601, 00:07:31.408 "message": "Method not found" 00:07:31.408 } 00:07:31.408 16:43:17 -- common/autotest_common.sh@653 -- # es=1 00:07:31.408 16:43:17 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:31.408 16:43:17 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:31.408 16:43:17 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:31.408 16:43:17 -- app/cmdline.sh@1 -- # killprocess 482512 00:07:31.408 16:43:17 -- common/autotest_common.sh@936 -- # '[' -z 482512 ']' 00:07:31.408 16:43:17 -- common/autotest_common.sh@940 -- # kill -0 482512 00:07:31.408 16:43:17 -- common/autotest_common.sh@941 -- # uname 00:07:31.408 16:43:17 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:31.408 16:43:17 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 482512 00:07:31.408 16:43:17 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:31.408 16:43:17 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:31.408 16:43:17 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 482512' 00:07:31.408 killing process with pid 482512 00:07:31.408 16:43:17 -- common/autotest_common.sh@955 -- # kill 482512 00:07:31.408 16:43:17 -- common/autotest_common.sh@960 -- # wait 482512 00:07:31.667 00:07:31.667 real 0m1.730s 00:07:31.667 user 0m2.004s 00:07:31.667 sys 0m0.472s 00:07:31.667 16:43:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:31.667 16:43:17 -- common/autotest_common.sh@10 -- # set +x 00:07:31.667 ************************************ 00:07:31.667 END TEST app_cmdline 00:07:31.667 ************************************ 00:07:31.667 16:43:17 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:31.667 16:43:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:31.667 16:43:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:31.667 16:43:17 -- common/autotest_common.sh@10 -- # set +x 00:07:31.667 ************************************ 00:07:31.667 START TEST version 00:07:31.667 ************************************ 00:07:31.667 16:43:17 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:31.926 * Looking for test storage... 
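The version suite starting here rebuilds the release string from the SPDK_VERSION_* macros in include/spdk/version.h and cross-checks it against the Python package. A condensed sketch of the parsing it traces below (the grep | cut | tr pipeline is taken verbatim from the trace; the uppercase argument and the $SPDK variable are simplifications):

SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
get_header_version() {
    # Grep the macro line, take the tab-separated value field, and strip
    # quotes (the SUFFIX macro is a quoted string like "-pre").
    grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" \
        "$SPDK/include/spdk/version.h" | cut -f2 | tr -d '"'
}
major=$(get_header_version MAJOR)    # 24 in this run
minor=$(get_header_version MINOR)    # 1
patch=$(get_header_version PATCH)    # 1
version="$major.$minor"
((patch != 0)) && version="$version.$patch"
echo "$version"   # 24.1.1; version.sh then maps the -pre suffix to rc0
                  # before comparing with python3 spdk.__version__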
00:07:31.926 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:31.926 16:43:17 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:31.926 16:43:17 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:31.926 16:43:17 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:31.926 16:43:17 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:31.927 16:43:17 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:31.927 16:43:17 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:31.927 16:43:17 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:31.927 16:43:17 -- scripts/common.sh@335 -- # IFS=.-: 00:07:31.927 16:43:17 -- scripts/common.sh@335 -- # read -ra ver1 00:07:31.927 16:43:17 -- scripts/common.sh@336 -- # IFS=.-: 00:07:31.927 16:43:17 -- scripts/common.sh@336 -- # read -ra ver2 00:07:31.927 16:43:17 -- scripts/common.sh@337 -- # local 'op=<' 00:07:31.927 16:43:17 -- scripts/common.sh@339 -- # ver1_l=2 00:07:31.927 16:43:17 -- scripts/common.sh@340 -- # ver2_l=1 00:07:31.927 16:43:17 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:31.927 16:43:17 -- scripts/common.sh@343 -- # case "$op" in 00:07:31.927 16:43:17 -- scripts/common.sh@344 -- # : 1 00:07:31.927 16:43:17 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:31.927 16:43:17 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:31.927 16:43:17 -- scripts/common.sh@364 -- # decimal 1 00:07:31.927 16:43:17 -- scripts/common.sh@352 -- # local d=1 00:07:31.927 16:43:17 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:31.927 16:43:17 -- scripts/common.sh@354 -- # echo 1 00:07:31.927 16:43:17 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:31.927 16:43:17 -- scripts/common.sh@365 -- # decimal 2 00:07:31.927 16:43:17 -- scripts/common.sh@352 -- # local d=2 00:07:31.927 16:43:17 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:31.927 16:43:17 -- scripts/common.sh@354 -- # echo 2 00:07:31.927 16:43:17 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:31.927 16:43:17 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:31.927 16:43:17 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:31.927 16:43:17 -- scripts/common.sh@367 -- # return 0 00:07:31.927 16:43:17 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:31.927 16:43:17 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:31.927 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.927 --rc genhtml_branch_coverage=1 00:07:31.927 --rc genhtml_function_coverage=1 00:07:31.927 --rc genhtml_legend=1 00:07:31.927 --rc geninfo_all_blocks=1 00:07:31.927 --rc geninfo_unexecuted_blocks=1 00:07:31.927 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.927 ' 00:07:31.927 16:43:17 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:31.927 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.927 --rc genhtml_branch_coverage=1 00:07:31.927 --rc genhtml_function_coverage=1 00:07:31.927 --rc genhtml_legend=1 00:07:31.927 --rc geninfo_all_blocks=1 00:07:31.927 --rc geninfo_unexecuted_blocks=1 00:07:31.927 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.927 ' 00:07:31.927 16:43:17 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:31.927 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.927 --rc genhtml_branch_coverage=1 00:07:31.927 
--rc genhtml_function_coverage=1 00:07:31.927 --rc genhtml_legend=1 00:07:31.927 --rc geninfo_all_blocks=1 00:07:31.927 --rc geninfo_unexecuted_blocks=1 00:07:31.927 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.927 ' 00:07:31.927 16:43:17 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:31.927 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.927 --rc genhtml_branch_coverage=1 00:07:31.927 --rc genhtml_function_coverage=1 00:07:31.927 --rc genhtml_legend=1 00:07:31.927 --rc geninfo_all_blocks=1 00:07:31.927 --rc geninfo_unexecuted_blocks=1 00:07:31.927 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:31.927 ' 00:07:31.927 16:43:17 -- app/version.sh@17 -- # get_header_version major 00:07:31.927 16:43:17 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:31.927 16:43:17 -- app/version.sh@14 -- # cut -f2 00:07:31.927 16:43:17 -- app/version.sh@14 -- # tr -d '"' 00:07:31.927 16:43:17 -- app/version.sh@17 -- # major=24 00:07:31.927 16:43:17 -- app/version.sh@18 -- # get_header_version minor 00:07:31.927 16:43:17 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:31.927 16:43:17 -- app/version.sh@14 -- # cut -f2 00:07:31.927 16:43:17 -- app/version.sh@14 -- # tr -d '"' 00:07:31.927 16:43:17 -- app/version.sh@18 -- # minor=1 00:07:31.927 16:43:17 -- app/version.sh@19 -- # get_header_version patch 00:07:31.927 16:43:17 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:31.927 16:43:17 -- app/version.sh@14 -- # cut -f2 00:07:31.927 16:43:17 -- app/version.sh@14 -- # tr -d '"' 00:07:31.927 16:43:17 -- app/version.sh@19 -- # patch=1 00:07:31.927 16:43:17 -- app/version.sh@20 -- # get_header_version suffix 00:07:31.927 16:43:17 -- app/version.sh@14 -- # tr -d '"' 00:07:31.927 16:43:17 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:31.927 16:43:17 -- app/version.sh@14 -- # cut -f2 00:07:31.927 16:43:17 -- app/version.sh@20 -- # suffix=-pre 00:07:31.927 16:43:17 -- app/version.sh@22 -- # version=24.1 00:07:31.927 16:43:17 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:31.927 16:43:17 -- app/version.sh@25 -- # version=24.1.1 00:07:31.927 16:43:17 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:31.927 16:43:17 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:31.927 16:43:17 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:31.927 16:43:17 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:31.927 16:43:17 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:31.927 00:07:31.927 real 0m0.226s 00:07:31.927 user 0m0.126s 00:07:31.927 sys 0m0.145s 00:07:31.927 16:43:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:31.927 16:43:17 -- common/autotest_common.sh@10 -- # set +x 00:07:31.927 
************************************ 00:07:31.927 END TEST version 00:07:31.927 ************************************ 00:07:31.927 16:43:17 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:07:31.927 16:43:17 -- spdk/autotest.sh@191 -- # uname -s 00:07:32.187 16:43:17 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:07:32.187 16:43:17 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:32.187 16:43:17 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:32.187 16:43:17 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@255 -- # timing_exit lib 00:07:32.187 16:43:17 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:32.187 16:43:17 -- common/autotest_common.sh@10 -- # set +x 00:07:32.187 16:43:17 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:32.187 16:43:17 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:07:32.187 16:43:17 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:07:32.187 16:43:17 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:07:32.187 16:43:17 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:32.187 16:43:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:32.187 16:43:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:32.187 16:43:17 -- common/autotest_common.sh@10 -- # set +x 00:07:32.187 ************************************ 00:07:32.187 START TEST llvm_fuzz 00:07:32.187 ************************************ 00:07:32.187 16:43:17 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:32.187 * Looking for test storage... 
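The long run of '[' 0 -eq 1 ']' tests traced above is autotest.sh walking its per-suite gates: each disabled SPDK_TEST_* flag fails the comparison and its suite is skipped, until the fuzzer gate matches [[ 1 -eq 1 ]] and run_test dispatches llvm.sh. A minimal sketch of that gating pattern, assuming the gate is keyed on SPDK_TEST_FUZZER (the flag dump later in this log shows it set to 1; run_test is SPDK's own timing/tracing wrapper from autotest_common.sh):

rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
# Provides run_test plus the SPDK_TEST_* flag defaults seen in this log.
source "$rootdir/test/common/autotest_common.sh"
# Every suite in autotest.sh sits behind one such flag; in this run only
# the fuzzer gate is enabled, so the earlier gates traced as
# '[' 0 -eq 1 ']' and fell through.
if [[ $SPDK_TEST_FUZZER -eq 1 ]]; then
    run_test llvm_fuzz "$rootdir/test/fuzz/llvm.sh"
fi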
00:07:32.187 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:32.187 16:43:17 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:32.187 16:43:17 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:32.187 16:43:17 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:32.187 16:43:17 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:32.187 16:43:17 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:32.187 16:43:17 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:32.187 16:43:17 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:32.187 16:43:17 -- scripts/common.sh@335 -- # IFS=.-: 00:07:32.187 16:43:17 -- scripts/common.sh@335 -- # read -ra ver1 00:07:32.187 16:43:17 -- scripts/common.sh@336 -- # IFS=.-: 00:07:32.187 16:43:17 -- scripts/common.sh@336 -- # read -ra ver2 00:07:32.187 16:43:17 -- scripts/common.sh@337 -- # local 'op=<' 00:07:32.187 16:43:17 -- scripts/common.sh@339 -- # ver1_l=2 00:07:32.187 16:43:17 -- scripts/common.sh@340 -- # ver2_l=1 00:07:32.187 16:43:17 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:32.187 16:43:17 -- scripts/common.sh@343 -- # case "$op" in 00:07:32.187 16:43:17 -- scripts/common.sh@344 -- # : 1 00:07:32.187 16:43:17 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:32.187 16:43:17 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:32.187 16:43:17 -- scripts/common.sh@364 -- # decimal 1 00:07:32.187 16:43:17 -- scripts/common.sh@352 -- # local d=1 00:07:32.187 16:43:17 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:32.187 16:43:17 -- scripts/common.sh@354 -- # echo 1 00:07:32.187 16:43:17 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:32.187 16:43:17 -- scripts/common.sh@365 -- # decimal 2 00:07:32.187 16:43:17 -- scripts/common.sh@352 -- # local d=2 00:07:32.187 16:43:17 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:32.187 16:43:17 -- scripts/common.sh@354 -- # echo 2 00:07:32.187 16:43:17 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:32.187 16:43:17 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:32.187 16:43:17 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:32.187 16:43:17 -- scripts/common.sh@367 -- # return 0 00:07:32.187 16:43:17 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:32.187 16:43:17 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:32.187 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.187 --rc genhtml_branch_coverage=1 00:07:32.187 --rc genhtml_function_coverage=1 00:07:32.187 --rc genhtml_legend=1 00:07:32.187 --rc geninfo_all_blocks=1 00:07:32.187 --rc geninfo_unexecuted_blocks=1 00:07:32.187 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.187 ' 00:07:32.187 16:43:17 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:32.187 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.187 --rc genhtml_branch_coverage=1 00:07:32.187 --rc genhtml_function_coverage=1 00:07:32.187 --rc genhtml_legend=1 00:07:32.187 --rc geninfo_all_blocks=1 00:07:32.187 --rc geninfo_unexecuted_blocks=1 00:07:32.187 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.187 ' 00:07:32.187 16:43:17 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:32.187 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.187 --rc genhtml_branch_coverage=1 00:07:32.188 
--rc genhtml_function_coverage=1 00:07:32.188 --rc genhtml_legend=1 00:07:32.188 --rc geninfo_all_blocks=1 00:07:32.188 --rc geninfo_unexecuted_blocks=1 00:07:32.188 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.188 ' 00:07:32.188 16:43:17 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:32.188 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.188 --rc genhtml_branch_coverage=1 00:07:32.188 --rc genhtml_function_coverage=1 00:07:32.188 --rc genhtml_legend=1 00:07:32.188 --rc geninfo_all_blocks=1 00:07:32.188 --rc geninfo_unexecuted_blocks=1 00:07:32.188 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.188 ' 00:07:32.188 16:43:17 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:32.188 16:43:17 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:32.188 16:43:17 -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:32.188 16:43:17 -- common/autotest_common.sh@548 -- # local fuzzers 00:07:32.188 16:43:17 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:32.188 16:43:17 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:32.188 16:43:17 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:32.188 16:43:17 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:32.188 16:43:17 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:32.188 16:43:17 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:32.188 16:43:17 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:32.188 16:43:17 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:32.188 16:43:17 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:32.188 16:43:17 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:32.188 16:43:17 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:32.188 16:43:17 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:32.188 16:43:17 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:32.188 16:43:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:32.188 16:43:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:32.188 16:43:17 -- common/autotest_common.sh@10 -- # set +x 00:07:32.188 ************************************ 00:07:32.188 START TEST nvmf_fuzz 00:07:32.188 ************************************ 00:07:32.188 16:43:17 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:32.450 * Looking for test storage... 
00:07:32.450 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.450 16:43:18 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:32.450 16:43:18 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:32.450 16:43:18 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:32.450 16:43:18 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:32.450 16:43:18 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:32.450 16:43:18 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:32.450 16:43:18 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:32.450 16:43:18 -- scripts/common.sh@335 -- # IFS=.-: 00:07:32.450 16:43:18 -- scripts/common.sh@335 -- # read -ra ver1 00:07:32.450 16:43:18 -- scripts/common.sh@336 -- # IFS=.-: 00:07:32.450 16:43:18 -- scripts/common.sh@336 -- # read -ra ver2 00:07:32.450 16:43:18 -- scripts/common.sh@337 -- # local 'op=<' 00:07:32.450 16:43:18 -- scripts/common.sh@339 -- # ver1_l=2 00:07:32.450 16:43:18 -- scripts/common.sh@340 -- # ver2_l=1 00:07:32.450 16:43:18 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:32.450 16:43:18 -- scripts/common.sh@343 -- # case "$op" in 00:07:32.450 16:43:18 -- scripts/common.sh@344 -- # : 1 00:07:32.450 16:43:18 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:32.450 16:43:18 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:32.450 16:43:18 -- scripts/common.sh@364 -- # decimal 1 00:07:32.450 16:43:18 -- scripts/common.sh@352 -- # local d=1 00:07:32.450 16:43:18 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:32.450 16:43:18 -- scripts/common.sh@354 -- # echo 1 00:07:32.450 16:43:18 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:32.450 16:43:18 -- scripts/common.sh@365 -- # decimal 2 00:07:32.450 16:43:18 -- scripts/common.sh@352 -- # local d=2 00:07:32.450 16:43:18 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:32.450 16:43:18 -- scripts/common.sh@354 -- # echo 2 00:07:32.450 16:43:18 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:32.450 16:43:18 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:32.450 16:43:18 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:32.450 16:43:18 -- scripts/common.sh@367 -- # return 0 00:07:32.450 16:43:18 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:32.450 16:43:18 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:32.450 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.450 --rc genhtml_branch_coverage=1 00:07:32.450 --rc genhtml_function_coverage=1 00:07:32.450 --rc genhtml_legend=1 00:07:32.450 --rc geninfo_all_blocks=1 00:07:32.450 --rc geninfo_unexecuted_blocks=1 00:07:32.450 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.450 ' 00:07:32.450 16:43:18 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:32.450 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.450 --rc genhtml_branch_coverage=1 00:07:32.450 --rc genhtml_function_coverage=1 00:07:32.450 --rc genhtml_legend=1 00:07:32.450 --rc geninfo_all_blocks=1 00:07:32.450 --rc geninfo_unexecuted_blocks=1 00:07:32.450 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.450 ' 00:07:32.450 16:43:18 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:32.450 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.450 --rc genhtml_branch_coverage=1 
00:07:32.450 --rc genhtml_function_coverage=1 00:07:32.450 --rc genhtml_legend=1 00:07:32.450 --rc geninfo_all_blocks=1 00:07:32.450 --rc geninfo_unexecuted_blocks=1 00:07:32.450 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.450 ' 00:07:32.450 16:43:18 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:32.450 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.450 --rc genhtml_branch_coverage=1 00:07:32.450 --rc genhtml_function_coverage=1 00:07:32.450 --rc genhtml_legend=1 00:07:32.450 --rc geninfo_all_blocks=1 00:07:32.450 --rc geninfo_unexecuted_blocks=1 00:07:32.450 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.450 ' 00:07:32.450 16:43:18 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:32.450 16:43:18 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:32.450 16:43:18 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:32.450 16:43:18 -- common/autotest_common.sh@34 -- # set -e 00:07:32.450 16:43:18 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:32.450 16:43:18 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:32.450 16:43:18 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:32.450 16:43:18 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:32.450 16:43:18 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:32.450 16:43:18 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:32.450 16:43:18 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:32.450 16:43:18 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:32.450 16:43:18 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:32.450 16:43:18 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:32.450 16:43:18 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:32.450 16:43:18 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:32.450 16:43:18 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:32.450 16:43:18 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:32.450 16:43:18 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:32.450 16:43:18 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:32.450 16:43:18 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:32.450 16:43:18 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:32.450 16:43:18 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:32.450 16:43:18 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:32.450 16:43:18 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:32.450 16:43:18 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:32.450 16:43:18 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:32.450 16:43:18 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:32.450 16:43:18 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:32.450 16:43:18 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:32.450 16:43:18 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:32.450 16:43:18 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:32.450 16:43:18 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:32.450 
16:43:18 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:32.450 16:43:18 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:32.450 16:43:18 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:32.450 16:43:18 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:32.450 16:43:18 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:32.450 16:43:18 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:32.450 16:43:18 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:32.450 16:43:18 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:32.450 16:43:18 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:32.450 16:43:18 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:32.450 16:43:18 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:32.450 16:43:18 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:32.450 16:43:18 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:32.450 16:43:18 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:32.450 16:43:18 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:32.450 16:43:18 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:32.450 16:43:18 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:32.450 16:43:18 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:32.450 16:43:18 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:32.450 16:43:18 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:32.450 16:43:18 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:32.450 16:43:18 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:32.450 16:43:18 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:32.450 16:43:18 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:32.450 16:43:18 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:32.450 16:43:18 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:32.450 16:43:18 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:32.450 16:43:18 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:32.450 16:43:18 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:32.450 16:43:18 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:32.450 16:43:18 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:32.450 16:43:18 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:32.450 16:43:18 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:32.450 16:43:18 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:32.450 16:43:18 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:32.450 16:43:18 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:32.450 16:43:18 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:32.450 16:43:18 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:32.451 16:43:18 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:32.451 16:43:18 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:32.451 16:43:18 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:32.451 16:43:18 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:32.451 16:43:18 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:32.451 16:43:18 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:32.451 16:43:18 -- 
common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:32.451 16:43:18 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:07:32.451 16:43:18 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:32.451 16:43:18 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:32.451 16:43:18 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:32.451 16:43:18 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:32.451 16:43:18 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:32.451 16:43:18 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:32.451 16:43:18 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:32.451 16:43:18 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:32.451 16:43:18 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:32.451 16:43:18 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:32.451 16:43:18 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:32.451 16:43:18 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:32.451 16:43:18 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:32.451 16:43:18 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:32.451 16:43:18 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:32.451 16:43:18 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:32.451 16:43:18 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:32.451 16:43:18 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:32.451 16:43:18 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:32.451 16:43:18 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:32.451 16:43:18 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:32.451 16:43:18 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:32.451 16:43:18 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:32.451 16:43:18 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:32.451 #define SPDK_CONFIG_H 00:07:32.451 #define SPDK_CONFIG_APPS 1 00:07:32.451 #define SPDK_CONFIG_ARCH native 00:07:32.451 #undef SPDK_CONFIG_ASAN 00:07:32.451 #undef SPDK_CONFIG_AVAHI 00:07:32.451 #undef SPDK_CONFIG_CET 00:07:32.451 #define SPDK_CONFIG_COVERAGE 1 00:07:32.451 #define SPDK_CONFIG_CROSS_PREFIX 00:07:32.451 #undef SPDK_CONFIG_CRYPTO 00:07:32.451 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:32.451 #undef SPDK_CONFIG_CUSTOMOCF 00:07:32.451 #undef SPDK_CONFIG_DAOS 00:07:32.451 #define SPDK_CONFIG_DAOS_DIR 00:07:32.451 #define SPDK_CONFIG_DEBUG 1 00:07:32.451 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:32.451 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:32.451 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:32.451 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:32.451 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:32.451 #define 
SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:32.451 #define SPDK_CONFIG_EXAMPLES 1 00:07:32.451 #undef SPDK_CONFIG_FC 00:07:32.451 #define SPDK_CONFIG_FC_PATH 00:07:32.451 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:32.451 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:32.451 #undef SPDK_CONFIG_FUSE 00:07:32.451 #define SPDK_CONFIG_FUZZER 1 00:07:32.451 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:32.451 #undef SPDK_CONFIG_GOLANG 00:07:32.451 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:32.451 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:32.451 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:32.451 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:32.451 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:32.451 #define SPDK_CONFIG_IDXD 1 00:07:32.451 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:32.451 #undef SPDK_CONFIG_IPSEC_MB 00:07:32.451 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:32.451 #define SPDK_CONFIG_ISAL 1 00:07:32.451 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:32.451 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:32.451 #define SPDK_CONFIG_LIBDIR 00:07:32.451 #undef SPDK_CONFIG_LTO 00:07:32.451 #define SPDK_CONFIG_MAX_LCORES 00:07:32.451 #define SPDK_CONFIG_NVME_CUSE 1 00:07:32.451 #undef SPDK_CONFIG_OCF 00:07:32.451 #define SPDK_CONFIG_OCF_PATH 00:07:32.451 #define SPDK_CONFIG_OPENSSL_PATH 00:07:32.451 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:32.451 #undef SPDK_CONFIG_PGO_USE 00:07:32.451 #define SPDK_CONFIG_PREFIX /usr/local 00:07:32.451 #undef SPDK_CONFIG_RAID5F 00:07:32.451 #undef SPDK_CONFIG_RBD 00:07:32.451 #define SPDK_CONFIG_RDMA 1 00:07:32.451 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:32.451 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:32.451 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:32.451 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:32.451 #undef SPDK_CONFIG_SHARED 00:07:32.451 #undef SPDK_CONFIG_SMA 00:07:32.451 #define SPDK_CONFIG_TESTS 1 00:07:32.451 #undef SPDK_CONFIG_TSAN 00:07:32.451 #define SPDK_CONFIG_UBLK 1 00:07:32.451 #define SPDK_CONFIG_UBSAN 1 00:07:32.451 #undef SPDK_CONFIG_UNIT_TESTS 00:07:32.451 #undef SPDK_CONFIG_URING 00:07:32.451 #define SPDK_CONFIG_URING_PATH 00:07:32.451 #undef SPDK_CONFIG_URING_ZNS 00:07:32.451 #undef SPDK_CONFIG_USDT 00:07:32.451 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:32.451 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:32.451 #define SPDK_CONFIG_VFIO_USER 1 00:07:32.451 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:32.451 #define SPDK_CONFIG_VHOST 1 00:07:32.451 #define SPDK_CONFIG_VIRTIO 1 00:07:32.451 #undef SPDK_CONFIG_VTUNE 00:07:32.451 #define SPDK_CONFIG_VTUNE_DIR 00:07:32.451 #define SPDK_CONFIG_WERROR 1 00:07:32.451 #define SPDK_CONFIG_WPDK_DIR 00:07:32.451 #undef SPDK_CONFIG_XNVME 00:07:32.451 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:32.451 16:43:18 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:32.451 16:43:18 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:32.451 16:43:18 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:32.451 16:43:18 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:32.451 16:43:18 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:32.451 16:43:18 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:32.451 16:43:18 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:32.451 16:43:18 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:32.451 16:43:18 -- paths/export.sh@5 -- # export PATH 00:07:32.451 16:43:18 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:32.451 16:43:18 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:32.451 16:43:18 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:32.451 16:43:18 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:32.451 16:43:18 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:32.451 16:43:18 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:32.451 16:43:18 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:32.451 16:43:18 -- pm/common@16 -- # TEST_TAG=N/A 00:07:32.451 16:43:18 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:32.451 16:43:18 -- common/autotest_common.sh@52 -- # : 1 00:07:32.451 16:43:18 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:32.451 16:43:18 -- common/autotest_common.sh@56 -- # : 0 00:07:32.451 16:43:18 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:32.451 16:43:18 -- common/autotest_common.sh@58 -- # : 0 00:07:32.451 16:43:18 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:32.451 16:43:18 -- common/autotest_common.sh@60 -- # : 1 00:07:32.451 16:43:18 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:07:32.451 16:43:18 -- common/autotest_common.sh@62 -- # : 0 00:07:32.451 16:43:18 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:32.451 16:43:18 -- common/autotest_common.sh@64 -- # : 00:07:32.451 16:43:18 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:32.451 16:43:18 -- common/autotest_common.sh@66 -- # : 0 00:07:32.451 16:43:18 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:32.451 16:43:18 -- common/autotest_common.sh@68 -- # : 0 00:07:32.451 16:43:18 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:32.451 16:43:18 -- common/autotest_common.sh@70 -- # : 0 00:07:32.451 16:43:18 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:32.452 16:43:18 -- common/autotest_common.sh@72 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:32.452 16:43:18 -- common/autotest_common.sh@74 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:32.452 16:43:18 -- common/autotest_common.sh@76 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:32.452 16:43:18 -- common/autotest_common.sh@78 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:32.452 16:43:18 -- common/autotest_common.sh@80 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:32.452 16:43:18 -- common/autotest_common.sh@82 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:32.452 16:43:18 -- common/autotest_common.sh@84 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:32.452 16:43:18 -- common/autotest_common.sh@86 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:32.452 16:43:18 -- common/autotest_common.sh@88 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:32.452 16:43:18 -- common/autotest_common.sh@90 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:32.452 16:43:18 -- common/autotest_common.sh@92 -- # : 1 00:07:32.452 16:43:18 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:32.452 16:43:18 -- common/autotest_common.sh@94 -- # : 1 00:07:32.452 16:43:18 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:32.452 16:43:18 -- common/autotest_common.sh@96 -- # : rdma 00:07:32.452 16:43:18 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:32.452 16:43:18 -- common/autotest_common.sh@98 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:32.452 16:43:18 -- common/autotest_common.sh@100 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:32.452 16:43:18 -- common/autotest_common.sh@102 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:32.452 16:43:18 -- common/autotest_common.sh@104 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:32.452 16:43:18 -- common/autotest_common.sh@106 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:32.452 16:43:18 -- common/autotest_common.sh@108 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:07:32.452 16:43:18 -- common/autotest_common.sh@110 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:32.452 16:43:18 -- common/autotest_common.sh@112 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:32.452 16:43:18 -- common/autotest_common.sh@114 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:32.452 16:43:18 -- common/autotest_common.sh@116 -- # : 1 00:07:32.452 16:43:18 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:32.452 16:43:18 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:32.452 16:43:18 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:32.452 16:43:18 -- common/autotest_common.sh@120 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:32.452 16:43:18 -- common/autotest_common.sh@122 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:32.452 16:43:18 -- common/autotest_common.sh@124 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:32.452 16:43:18 -- common/autotest_common.sh@126 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:32.452 16:43:18 -- common/autotest_common.sh@128 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:32.452 16:43:18 -- common/autotest_common.sh@130 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:32.452 16:43:18 -- common/autotest_common.sh@132 -- # : v22.11.4 00:07:32.452 16:43:18 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:32.452 16:43:18 -- common/autotest_common.sh@134 -- # : true 00:07:32.452 16:43:18 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:32.452 16:43:18 -- common/autotest_common.sh@136 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:32.452 16:43:18 -- common/autotest_common.sh@138 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:32.452 16:43:18 -- common/autotest_common.sh@140 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:32.452 16:43:18 -- common/autotest_common.sh@142 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:32.452 16:43:18 -- common/autotest_common.sh@144 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:32.452 16:43:18 -- common/autotest_common.sh@146 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:32.452 16:43:18 -- common/autotest_common.sh@148 -- # : 00:07:32.452 16:43:18 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:32.452 16:43:18 -- common/autotest_common.sh@150 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:32.452 16:43:18 -- common/autotest_common.sh@152 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:32.452 16:43:18 -- common/autotest_common.sh@154 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:32.452 16:43:18 -- 
common/autotest_common.sh@156 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:32.452 16:43:18 -- common/autotest_common.sh@158 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:32.452 16:43:18 -- common/autotest_common.sh@160 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:32.452 16:43:18 -- common/autotest_common.sh@163 -- # : 00:07:32.452 16:43:18 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:32.452 16:43:18 -- common/autotest_common.sh@165 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:32.452 16:43:18 -- common/autotest_common.sh@167 -- # : 0 00:07:32.452 16:43:18 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:32.452 16:43:18 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:32.452 16:43:18 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:32.452 16:43:18 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:32.452 16:43:18 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:32.452 16:43:18 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:32.452 16:43:18 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:32.452 16:43:18 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:32.452 16:43:18 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:32.452 16:43:18 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:32.452 16:43:18 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:32.452 16:43:18 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:32.452 16:43:18 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:32.452 16:43:18 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:32.452 16:43:18 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:32.452 16:43:18 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:32.452 16:43:18 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:32.452 16:43:18 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:32.452 16:43:18 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:32.452 16:43:18 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:32.453 16:43:18 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:32.453 16:43:18 -- common/autotest_common.sh@196 -- # cat 00:07:32.453 16:43:18 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:32.453 16:43:18 -- common/autotest_common.sh@224 -- # export 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:32.453 16:43:18 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:32.453 16:43:18 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:32.453 16:43:18 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:32.453 16:43:18 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:32.453 16:43:18 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:32.453 16:43:18 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:32.453 16:43:18 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:32.453 16:43:18 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:32.453 16:43:18 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:32.453 16:43:18 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:32.453 16:43:18 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:32.453 16:43:18 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:32.453 16:43:18 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:32.453 16:43:18 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:32.453 16:43:18 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:32.453 16:43:18 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:32.453 16:43:18 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:32.453 16:43:18 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:32.453 16:43:18 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:32.453 16:43:18 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:32.453 16:43:18 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:32.453 16:43:18 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:32.453 16:43:18 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:32.453 16:43:18 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:32.453 16:43:18 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:32.453 16:43:18 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:32.453 16:43:18 -- common/autotest_common.sh@259 -- # valgrind= 00:07:32.453 16:43:18 -- common/autotest_common.sh@265 -- # uname -s 00:07:32.453 16:43:18 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:32.453 16:43:18 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:32.453 16:43:18 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:32.453 16:43:18 -- 
common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:32.453 16:43:18 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@275 -- # MAKE=make 00:07:32.453 16:43:18 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:32.453 16:43:18 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:32.453 16:43:18 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:32.453 16:43:18 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:32.453 16:43:18 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:32.453 16:43:18 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:32.453 16:43:18 -- common/autotest_common.sh@319 -- # [[ -z 482959 ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@319 -- # kill -0 482959 00:07:32.453 16:43:18 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:32.453 16:43:18 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:32.453 16:43:18 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:32.453 16:43:18 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:32.453 16:43:18 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:32.453 16:43:18 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:32.453 16:43:18 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:32.453 16:43:18 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.mF2zZF 00:07:32.453 16:43:18 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:32.453 16:43:18 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.mF2zZF/tests/nvmf /tmp/spdk.mF2zZF 00:07:32.453 16:43:18 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:32.453 16:43:18 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:32.453 16:43:18 -- common/autotest_common.sh@328 -- # df -T 00:07:32.453 16:43:18 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:32.453 16:43:18 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:32.453 16:43:18 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:32.453 16:43:18 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:07:32.453 16:43:18 -- common/autotest_common.sh@361 
-- # read -r source fs size use avail _ mount 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # avails["$mount"]=53102751744 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730578432 00:07:32.453 16:43:18 -- common/autotest_common.sh@364 -- # uses["$mount"]=8627826688 00:07:32.453 16:43:18 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864031744 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:07:32.453 16:43:18 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:07:32.453 16:43:18 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340117504 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144 00:07:32.453 16:43:18 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:07:32.453 16:43:18 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864969728 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:07:32.453 16:43:18 -- common/autotest_common.sh@364 -- # uses["$mount"]=319488 00:07:32.453 16:43:18 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:32.453 16:43:18 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:32.453 16:43:18 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:32.453 16:43:18 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:32.453 16:43:18 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:32.453 * Looking for test storage... 
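[Editor's note on the trace above: set_test_storage reduces to a simple probe. It runs df -T once, reads each mount's filesystem type, size, and available space into the mounts/fss/sizes/avails arrays, then accepts the first candidate directory whose free space covers requested_size (the 2214592512 bytes seen above), falling back to a mktemp scratch path otherwise. Below is a minimal standalone sketch of that pattern, assuming GNU df; pick_test_storage and the variable names are illustrative, not the real autotest_common.sh implementation.]

    #!/usr/bin/env bash
    # Illustrative re-creation of the storage probe; not the SPDK original.
    requested_kb=$((2214592512 / 1024))   # same byte budget as the trace

    pick_test_storage() {
        local candidate avail_kb
        for candidate in "$@"; do
            [[ -d $candidate ]] || continue
            # df -T columns: source type 1K-blocks used avail use% mount
            avail_kb=$(df -T "$candidate" | awk 'NR == 2 { print $5 }')
            if (( avail_kb >= requested_kb )); then
                printf '* Found test storage at %s\n' "$candidate" >&2
                printf '%s\n' "$candidate"
                return 0
            fi
        done
        # fall back to a throwaway scratch dir, as the script does via mktemp
        mktemp -dt spdk.XXXXXX
    }

    # usage: storage=$(pick_test_storage "$testdir" /tmp)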
00:07:32.453 16:43:18 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:32.453 16:43:18 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:32.453 16:43:18 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.453 16:43:18 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:32.453 16:43:18 -- common/autotest_common.sh@373 -- # mount=/ 00:07:32.453 16:43:18 -- common/autotest_common.sh@375 -- # target_space=53102751744 00:07:32.453 16:43:18 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:32.453 16:43:18 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:32.453 16:43:18 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:32.453 16:43:18 -- common/autotest_common.sh@382 -- # new_size=10842419200 00:07:32.453 16:43:18 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:32.453 16:43:18 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.453 16:43:18 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.453 16:43:18 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.453 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.453 16:43:18 -- common/autotest_common.sh@390 -- # return 0 00:07:32.453 16:43:18 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:32.453 16:43:18 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:07:32.454 16:43:18 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:32.454 16:43:18 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:32.454 16:43:18 -- common/autotest_common.sh@1682 -- # true 00:07:32.454 16:43:18 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:32.454 16:43:18 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:32.454 16:43:18 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:32.454 16:43:18 -- common/autotest_common.sh@27 -- # exec 00:07:32.454 16:43:18 -- common/autotest_common.sh@29 -- # exec 00:07:32.454 16:43:18 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:32.454 16:43:18 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:32.454 16:43:18 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:32.454 16:43:18 -- common/autotest_common.sh@18 -- # set -x 00:07:32.454 16:43:18 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:32.454 16:43:18 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:32.454 16:43:18 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:32.714 16:43:18 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:32.714 16:43:18 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:32.714 16:43:18 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:32.714 16:43:18 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:32.714 16:43:18 -- scripts/common.sh@335 -- # IFS=.-: 00:07:32.714 16:43:18 -- scripts/common.sh@335 -- # read -ra ver1 00:07:32.714 16:43:18 -- scripts/common.sh@336 -- # IFS=.-: 00:07:32.714 16:43:18 -- scripts/common.sh@336 -- # read -ra ver2 00:07:32.714 16:43:18 -- scripts/common.sh@337 -- # local 'op=<' 00:07:32.714 16:43:18 -- scripts/common.sh@339 -- # ver1_l=2 00:07:32.714 16:43:18 -- scripts/common.sh@340 -- # ver2_l=1 00:07:32.714 16:43:18 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:32.714 16:43:18 -- scripts/common.sh@343 -- # case "$op" in 00:07:32.714 16:43:18 -- scripts/common.sh@344 -- # : 1 00:07:32.714 16:43:18 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:32.714 16:43:18 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:32.714 16:43:18 -- scripts/common.sh@364 -- # decimal 1 00:07:32.714 16:43:18 -- scripts/common.sh@352 -- # local d=1 00:07:32.714 16:43:18 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:32.714 16:43:18 -- scripts/common.sh@354 -- # echo 1 00:07:32.714 16:43:18 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:32.714 16:43:18 -- scripts/common.sh@365 -- # decimal 2 00:07:32.714 16:43:18 -- scripts/common.sh@352 -- # local d=2 00:07:32.714 16:43:18 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:32.714 16:43:18 -- scripts/common.sh@354 -- # echo 2 00:07:32.714 16:43:18 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:32.714 16:43:18 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:32.714 16:43:18 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:32.714 16:43:18 -- scripts/common.sh@367 -- # return 0 00:07:32.714 16:43:18 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:32.714 16:43:18 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:32.714 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.714 --rc genhtml_branch_coverage=1 00:07:32.714 --rc genhtml_function_coverage=1 00:07:32.714 --rc genhtml_legend=1 00:07:32.714 --rc geninfo_all_blocks=1 00:07:32.714 --rc geninfo_unexecuted_blocks=1 00:07:32.714 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.714 ' 00:07:32.714 16:43:18 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:32.714 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.714 --rc genhtml_branch_coverage=1 00:07:32.714 --rc genhtml_function_coverage=1 00:07:32.714 --rc genhtml_legend=1 00:07:32.714 --rc geninfo_all_blocks=1 00:07:32.714 --rc geninfo_unexecuted_blocks=1 00:07:32.714 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.714 ' 00:07:32.714 16:43:18 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:32.714 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:07:32.714 --rc genhtml_branch_coverage=1 00:07:32.714 --rc genhtml_function_coverage=1 00:07:32.714 --rc genhtml_legend=1 00:07:32.714 --rc geninfo_all_blocks=1 00:07:32.714 --rc geninfo_unexecuted_blocks=1 00:07:32.714 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.714 ' 00:07:32.714 16:43:18 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:32.714 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.714 --rc genhtml_branch_coverage=1 00:07:32.714 --rc genhtml_function_coverage=1 00:07:32.714 --rc genhtml_legend=1 00:07:32.714 --rc geninfo_all_blocks=1 00:07:32.714 --rc geninfo_unexecuted_blocks=1 00:07:32.714 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.714 ' 00:07:32.714 16:43:18 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:32.714 16:43:18 -- ../common.sh@8 -- # pids=() 00:07:32.714 16:43:18 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:32.714 16:43:18 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:32.714 16:43:18 -- nvmf/run.sh@56 -- # fuzz_num=25 00:07:32.714 16:43:18 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:07:32.714 16:43:18 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:07:32.714 16:43:18 -- nvmf/run.sh@61 -- # mem_size=512 00:07:32.714 16:43:18 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:07:32.714 16:43:18 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:07:32.714 16:43:18 -- ../common.sh@69 -- # local fuzz_num=25 00:07:32.714 16:43:18 -- ../common.sh@70 -- # local time=1 00:07:32.714 16:43:18 -- ../common.sh@72 -- # (( i = 0 )) 00:07:32.714 16:43:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:32.714 16:43:18 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:32.714 16:43:18 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:32.714 16:43:18 -- nvmf/run.sh@24 -- # local timen=1 00:07:32.714 16:43:18 -- nvmf/run.sh@25 -- # local core=0x1 00:07:32.714 16:43:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:32.714 16:43:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:32.714 16:43:18 -- nvmf/run.sh@29 -- # printf %02d 0 00:07:32.714 16:43:18 -- nvmf/run.sh@29 -- # port=4400 00:07:32.714 16:43:18 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:32.714 16:43:18 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:32.714 16:43:18 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:32.714 16:43:18 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:07:32.714 [2024-11-16 16:43:18.312359] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 
initialization... 00:07:32.714 [2024-11-16 16:43:18.312428] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483024 ] 00:07:32.714 EAL: No free 2048 kB hugepages reported on node 1 00:07:32.974 [2024-11-16 16:43:18.494895] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.974 [2024-11-16 16:43:18.514887] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:32.974 [2024-11-16 16:43:18.515022] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.974 [2024-11-16 16:43:18.566473] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:32.974 [2024-11-16 16:43:18.582828] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:32.974 INFO: Running with entropic power schedule (0xFF, 100). 00:07:32.974 INFO: Seed: 2273260829 00:07:32.974 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:32.974 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:32.974 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:32.974 INFO: A corpus is not provided, starting from an empty corpus 00:07:32.974 #2 INITED exec/s: 0 rss: 59Mb 00:07:32.974 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:32.974 This may also happen if the target rejected all inputs we tried so far 00:07:32.974 [2024-11-16 16:43:18.658703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:32.974 [2024-11-16 16:43:18.658750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.233 NEW_FUNC[1/669]: 0x451418 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:33.233 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:33.233 #42 NEW cov: 11543 ft: 11544 corp: 2/119b lim: 320 exec/s: 0 rss: 67Mb L: 118/118 MS: 5 InsertRepeatedBytes-CMP-EraseBytes-InsertByte-InsertRepeatedBytes- DE: "\377\377\377\377"- 00:07:33.493 [2024-11-16 16:43:18.989678] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.493 [2024-11-16 16:43:18.989738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.493 NEW_FUNC[1/1]: 0x16ca2c8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:07:33.493 #43 NEW cov: 11658 ft: 12121 corp: 3/237b lim: 320 exec/s: 0 rss: 67Mb L: 118/118 MS: 1 ShuffleBytes- 00:07:33.493 [2024-11-16 16:43:19.039724] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.493 [2024-11-16 16:43:19.039755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.493 #44 NEW cov: 11664 ft: 12497 corp: 4/355b lim: 320 exec/s: 0 rss: 67Mb 
L: 118/118 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:33.493 [2024-11-16 16:43:19.079814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.493 [2024-11-16 16:43:19.079843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.493 #45 NEW cov: 11749 ft: 12768 corp: 5/473b lim: 320 exec/s: 0 rss: 67Mb L: 118/118 MS: 1 ChangeBit- 00:07:33.493 [2024-11-16 16:43:19.119965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:8b8b8b8b cdw10:8b8b8b8b cdw11:8b8b8b8b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:33.493 [2024-11-16 16:43:19.119992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.493 NEW_FUNC[1/1]: 0x16dd468 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:33.493 #50 NEW cov: 11784 ft: 13181 corp: 6/595b lim: 320 exec/s: 0 rss: 67Mb L: 122/122 MS: 5 CrossOver-ChangeByte-ChangeBinInt-CopyPart-InsertRepeatedBytes- 00:07:33.493 [2024-11-16 16:43:19.160103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f6) qid:0 cid:4 nsid:f7f6f6 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.493 [2024-11-16 16:43:19.160129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.493 #51 NEW cov: 11785 ft: 13277 corp: 7/713b lim: 320 exec/s: 0 rss: 67Mb L: 118/122 MS: 1 ChangeBinInt- 00:07:33.493 [2024-11-16 16:43:19.200263] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.493 [2024-11-16 16:43:19.200288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.493 #57 NEW cov: 11785 ft: 13342 corp: 8/831b lim: 320 exec/s: 0 rss: 67Mb L: 118/122 MS: 1 CopyPart- 00:07:33.493 [2024-11-16 16:43:19.240354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.493 [2024-11-16 16:43:19.240381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.753 #58 NEW cov: 11785 ft: 13374 corp: 9/949b lim: 320 exec/s: 0 rss: 67Mb L: 118/122 MS: 1 ChangeByte- 00:07:33.753 [2024-11-16 16:43:19.280405] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.753 [2024-11-16 16:43:19.280432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.753 #59 NEW cov: 11785 ft: 13414 corp: 10/1067b lim: 320 exec/s: 0 rss: 68Mb L: 118/122 MS: 1 ChangeBit- 00:07:33.753 [2024-11-16 16:43:19.320613] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.753 [2024-11-16 16:43:19.320640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.753 #60 NEW cov: 11785 ft: 13437 corp: 11/1186b lim: 320 exec/s: 0 rss: 68Mb L: 119/122 MS: 1 InsertByte- 00:07:33.753 [2024-11-16 
16:43:19.360730] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.753 [2024-11-16 16:43:19.360757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.753 #66 NEW cov: 11785 ft: 13455 corp: 12/1304b lim: 320 exec/s: 0 rss: 68Mb L: 118/122 MS: 1 CopyPart- 00:07:33.753 [2024-11-16 16:43:19.400806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f6) qid:0 cid:4 nsid:f7f6f6 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.753 [2024-11-16 16:43:19.400832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.753 #67 NEW cov: 11785 ft: 13489 corp: 13/1422b lim: 320 exec/s: 0 rss: 68Mb L: 118/122 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:33.753 [2024-11-16 16:43:19.441421] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.753 [2024-11-16 16:43:19.441449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.753 [2024-11-16 16:43:19.441581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:09000000 cdw11:09090909 00:07:33.753 [2024-11-16 16:43:19.441596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.753 [2024-11-16 16:43:19.441710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:33.753 [2024-11-16 16:43:19.441728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:33.753 #68 NEW cov: 11787 ft: 13739 corp: 14/1647b lim: 320 exec/s: 0 rss: 68Mb L: 225/225 MS: 1 InsertRepeatedBytes- 00:07:33.753 [2024-11-16 16:43:19.481108] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:33.753 [2024-11-16 16:43:19.481134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.753 #69 NEW cov: 11787 ft: 13783 corp: 15/1765b lim: 320 exec/s: 0 rss: 68Mb L: 118/225 MS: 1 ChangeBit- 00:07:34.012 [2024-11-16 16:43:19.521154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.012 [2024-11-16 16:43:19.521183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.012 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:34.012 #70 NEW cov: 11810 ft: 13828 corp: 16/1885b lim: 320 exec/s: 0 rss: 68Mb L: 120/225 MS: 1 CMP- DE: "\017\000"- 00:07:34.012 [2024-11-16 16:43:19.560921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:8b8b8b8b cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x8b8b8b8b8b8b8b8b 00:07:34.012 [2024-11-16 16:43:19.560950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
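[Editor's note: each "#N NEW cov: ..." entry above ends with an MS: field naming the libFuzzer mutation chain that produced the new coverage (ChangeBit, CrossOver, PersAutoDict with its DE: dictionary entry, and so on). A purely illustrative way to tally which mutators are paying off in a captured log; fuzz.log is a placeholder file name.]

    # Count mutation operators across all NEW-coverage lines in a saved log.
    grep -Eo 'MS: [0-9]+ [A-Za-z-]+' fuzz.log \
        | awk '{ print $3 }' \
        | tr '-' '\n' \
        | sed '/^$/d' \
        | sort | uniq -c | sort -rn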
00:07:34.012 #73 NEW cov: 11810 ft: 13846 corp: 17/2008b lim: 320 exec/s: 0 rss: 68Mb L: 123/225 MS: 3 ShuffleBytes-ChangeBit-CrossOver- 00:07:34.012 [2024-11-16 16:43:19.611521] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.012 [2024-11-16 16:43:19.611549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.012 #74 NEW cov: 11810 ft: 14041 corp: 18/2126b lim: 320 exec/s: 74 rss: 68Mb L: 118/225 MS: 1 ChangeBinInt- 00:07:34.012 [2024-11-16 16:43:19.651620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.012 [2024-11-16 16:43:19.651648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.012 #75 NEW cov: 11810 ft: 14092 corp: 19/2208b lim: 320 exec/s: 75 rss: 68Mb L: 82/225 MS: 1 CrossOver- 00:07:34.012 [2024-11-16 16:43:19.681682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f6) qid:0 cid:4 nsid:f7f6f6 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.012 [2024-11-16 16:43:19.681710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.012 #81 NEW cov: 11810 ft: 14124 corp: 20/2326b lim: 320 exec/s: 81 rss: 68Mb L: 118/225 MS: 1 ChangeByte- 00:07:34.012 [2024-11-16 16:43:19.721770] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.012 [2024-11-16 16:43:19.721815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.012 #82 NEW cov: 11810 ft: 14138 corp: 21/2412b lim: 320 exec/s: 82 rss: 68Mb L: 86/225 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:34.271 [2024-11-16 16:43:19.761474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0f) qid:0 cid:4 nsid:f60a00ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.271 [2024-11-16 16:43:19.761505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.271 NEW_FUNC[1/1]: 0x12de638 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2016 00:07:34.271 #87 NEW cov: 11841 ft: 14197 corp: 22/2537b lim: 320 exec/s: 87 rss: 68Mb L: 125/225 MS: 5 PersAutoDict-ChangeByte-PersAutoDict-PersAutoDict-CrossOver- DE: "\017\000"-"\377\377\377\377"-"\017\000"- 00:07:34.271 [2024-11-16 16:43:19.812107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f6) qid:0 cid:4 nsid:f7f6f6 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.271 [2024-11-16 16:43:19.812136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.271 #93 NEW cov: 11841 ft: 14228 corp: 23/2657b lim: 320 exec/s: 93 rss: 68Mb L: 120/225 MS: 1 PersAutoDict- DE: "\017\000"- 00:07:34.271 [2024-11-16 16:43:19.851865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f6) qid:0 cid:4 nsid:f7f6f6 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.271 [2024-11-16 16:43:19.851894] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.271 #94 NEW cov: 11841 ft: 14238 corp: 24/2775b lim: 320 exec/s: 94 rss: 68Mb L: 118/225 MS: 1 ChangeByte- 00:07:34.271 [2024-11-16 16:43:19.892023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f6) qid:0 cid:4 nsid:f7f6f6 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.271 [2024-11-16 16:43:19.892053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.271 #95 NEW cov: 11841 ft: 14242 corp: 25/2893b lim: 320 exec/s: 95 rss: 68Mb L: 118/225 MS: 1 ChangeByte- 00:07:34.271 [2024-11-16 16:43:19.922393] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.271 [2024-11-16 16:43:19.922420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.271 #96 NEW cov: 11841 ft: 14254 corp: 26/3014b lim: 320 exec/s: 96 rss: 68Mb L: 121/225 MS: 1 PersAutoDict- DE: "\017\000"- 00:07:34.271 [2024-11-16 16:43:19.962599] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.271 [2024-11-16 16:43:19.962629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.271 #97 NEW cov: 11841 ft: 14262 corp: 27/3133b lim: 320 exec/s: 97 rss: 68Mb L: 119/225 MS: 1 InsertByte- 00:07:34.271 [2024-11-16 16:43:20.002714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:8b8b8b8b cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x8b8b8b8b8b8b8b8b 00:07:34.271 [2024-11-16 16:43:20.002743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.531 #98 NEW cov: 11841 ft: 14282 corp: 28/3260b lim: 320 exec/s: 98 rss: 69Mb L: 127/225 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:34.531 [2024-11-16 16:43:20.052344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0f) qid:0 cid:4 nsid:f60a00ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff 00:07:34.531 [2024-11-16 16:43:20.052375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.531 #99 NEW cov: 11841 ft: 14317 corp: 29/3337b lim: 320 exec/s: 99 rss: 69Mb L: 77/225 MS: 1 EraseBytes- 00:07:34.531 [2024-11-16 16:43:20.092677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:8b8b8b8b cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x8b8b8b8b8b8b8b8b 00:07:34.531 [2024-11-16 16:43:20.092708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.531 [2024-11-16 16:43:20.092851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (8b) qid:0 cid:5 nsid:8b8b8b8b cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x8b8b8b8b8b8b8b8b 00:07:34.531 [2024-11-16 16:43:20.092867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.531 #100 NEW cov: 11841 ft: 14447 corp: 30/3465b lim: 320 exec/s: 
100 rss: 69Mb L: 128/225 MS: 1 InsertByte- 00:07:34.531 [2024-11-16 16:43:20.153307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:8b8b8b8b cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x8b8b8b8b8b8b8b8b 00:07:34.531 [2024-11-16 16:43:20.153336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.531 [2024-11-16 16:43:20.153482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (8b) qid:0 cid:5 nsid:8b8b8b8b cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x8b8b8b8b8b8b8b8b 00:07:34.531 [2024-11-16 16:43:20.153502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.531 #101 NEW cov: 11841 ft: 14460 corp: 31/3638b lim: 320 exec/s: 101 rss: 69Mb L: 173/225 MS: 1 CopyPart- 00:07:34.531 [2024-11-16 16:43:20.193091] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.531 [2024-11-16 16:43:20.193120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.531 [2024-11-16 16:43:20.193257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.531 [2024-11-16 16:43:20.193275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.531 [2024-11-16 16:43:20.193411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.531 [2024-11-16 16:43:20.193428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.531 #102 NEW cov: 11841 ft: 14548 corp: 32/3863b lim: 320 exec/s: 102 rss: 69Mb L: 225/225 MS: 1 InsertRepeatedBytes- 00:07:34.531 [2024-11-16 16:43:20.233136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (f6) qid:0 cid:4 nsid:f7f6f6 cdw10:09090909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.531 [2024-11-16 16:43:20.233164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.531 [2024-11-16 16:43:20.233276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:34.531 [2024-11-16 16:43:20.233292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.531 #104 NEW cov: 11841 ft: 14591 corp: 33/4006b lim: 320 exec/s: 104 rss: 69Mb L: 143/225 MS: 2 EraseBytes-CrossOver- 00:07:34.531 [2024-11-16 16:43:20.273450] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.531 [2024-11-16 16:43:20.273479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.790 #105 NEW cov: 11841 ft: 14604 corp: 34/4124b lim: 320 exec/s: 105 rss: 69Mb L: 118/225 MS: 1 ChangeBit- 00:07:34.790 [2024-11-16 16:43:20.313512] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0f) qid:0 cid:4 nsid:f60a00ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff 00:07:34.790 [2024-11-16 16:43:20.313540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.790 #106 NEW cov: 11841 ft: 14617 corp: 35/4202b lim: 320 exec/s: 106 rss: 69Mb L: 78/225 MS: 1 CrossOver- 00:07:34.790 [2024-11-16 16:43:20.353706] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:34.790 [2024-11-16 16:43:20.353736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.790 #107 NEW cov: 11841 ft: 14629 corp: 36/4320b lim: 320 exec/s: 107 rss: 69Mb L: 118/225 MS: 1 ChangeBinInt- 00:07:34.790 [2024-11-16 16:43:20.393749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0f) qid:0 cid:4 nsid:f60a00ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff 00:07:34.790 [2024-11-16 16:43:20.393777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.790 #108 NEW cov: 11841 ft: 14642 corp: 37/4398b lim: 320 exec/s: 108 rss: 69Mb L: 78/225 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:34.790 [2024-11-16 16:43:20.433931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:4 nsid:8b8b8b8b cdw10:8b8b8b8b cdw11:8b8b8b8b SGL TRANSPORT DATA BLOCK TRANSPORT 0x8b8b8b8b8b8b8b8b 00:07:34.790 [2024-11-16 16:43:20.433957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.790 #109 NEW cov: 11841 ft: 14645 corp: 38/4521b lim: 320 exec/s: 109 rss: 69Mb L: 123/225 MS: 1 ChangeBinInt- 00:07:34.790 [2024-11-16 16:43:20.473676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (0f) qid:0 cid:4 nsid:f60a00ff cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xfffffe 00:07:34.790 [2024-11-16 16:43:20.473704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.790 #110 NEW cov: 11841 ft: 14649 corp: 39/4599b lim: 320 exec/s: 110 rss: 69Mb L: 78/225 MS: 1 ChangeBinInt- 00:07:34.790 [2024-11-16 16:43:20.514142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:8b8b8b8b cdw10:8b8b8b8b cdw11:8b8b8b8b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:34.790 [2024-11-16 16:43:20.514169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.048 #111 NEW cov: 11841 ft: 14651 corp: 40/4721b lim: 320 exec/s: 111 rss: 69Mb L: 122/225 MS: 1 ChangeBinInt- 00:07:35.048 [2024-11-16 16:43:20.554617] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:91919191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.048 [2024-11-16 16:43:20.554644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.049 [2024-11-16 16:43:20.554771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (91) qid:0 cid:5 nsid:91919191 cdw10:91919191 cdw11:91919191 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.049 
[2024-11-16 16:43:20.554790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.049 [2024-11-16 16:43:20.554914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (91) qid:0 cid:6 nsid:91919191 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.049 [2024-11-16 16:43:20.554929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.049 #112 NEW cov: 11841 ft: 15139 corp: 41/4965b lim: 320 exec/s: 112 rss: 69Mb L: 244/244 MS: 1 InsertRepeatedBytes- 00:07:35.049 [2024-11-16 16:43:20.594458] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.049 [2024-11-16 16:43:20.594487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.049 #113 NEW cov: 11841 ft: 15152 corp: 42/5083b lim: 320 exec/s: 113 rss: 69Mb L: 118/244 MS: 1 ChangeByte- 00:07:35.049 [2024-11-16 16:43:20.634212] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.049 [2024-11-16 16:43:20.634243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.049 #114 NEW cov: 11841 ft: 15192 corp: 43/5205b lim: 320 exec/s: 57 rss: 69Mb L: 122/244 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:35.049 #114 DONE cov: 11841 ft: 15192 corp: 43/5205b lim: 320 exec/s: 57 rss: 69Mb 00:07:35.049 ###### Recommended dictionary. ###### 00:07:35.049 "\377\377\377\377" # Uses: 7 00:07:35.049 "\017\000" # Uses: 4 00:07:35.049 "\000\000\000\000" # Uses: 1 00:07:35.049 ###### End of recommended dictionary. 
###### 00:07:35.049 Done 114 runs in 2 second(s) 00:07:35.049 16:43:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:35.049 16:43:20 -- ../common.sh@72 -- # (( i++ )) 00:07:35.049 16:43:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:35.049 16:43:20 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:35.049 16:43:20 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:35.049 16:43:20 -- nvmf/run.sh@24 -- # local timen=1 00:07:35.049 16:43:20 -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.049 16:43:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.049 16:43:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:35.049 16:43:20 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:35.049 16:43:20 -- nvmf/run.sh@29 -- # port=4401 00:07:35.049 16:43:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.049 16:43:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:35.049 16:43:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.049 16:43:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:35.307 [2024-11-16 16:43:20.800813] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:35.307 [2024-11-16 16:43:20.800877] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid483557 ] 00:07:35.307 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.307 [2024-11-16 16:43:20.974770] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.307 [2024-11-16 16:43:20.994284] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:35.307 [2024-11-16 16:43:20.994413] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.307 [2024-11-16 16:43:21.045687] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.566 [2024-11-16 16:43:21.061983] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:35.566 INFO: Running with entropic power schedule (0xFF, 100). 00:07:35.566 INFO: Seed: 456266026 00:07:35.566 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:35.566 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:35.566 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.566 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.566 #2 INITED exec/s: 0 rss: 59Mb 00:07:35.566 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:35.566 This may also happen if the target rejected all inputs we tried so far 00:07:35.566 [2024-11-16 16:43:21.110904] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.566 [2024-11-16 16:43:21.111022] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.566 [2024-11-16 16:43:21.111229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.566 [2024-11-16 16:43:21.111263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.566 [2024-11-16 16:43:21.111317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.566 [2024-11-16 16:43:21.111331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.826 NEW_FUNC[1/671]: 0x451d18 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:35.826 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:35.826 #9 NEW cov: 11626 ft: 11627 corp: 2/15b lim: 30 exec/s: 0 rss: 67Mb L: 14/14 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:35.826 [2024-11-16 16:43:21.421709] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.826 [2024-11-16 16:43:21.421839] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.826 [2024-11-16 16:43:21.422059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.826 [2024-11-16 16:43:21.422092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.826 [2024-11-16 16:43:21.422156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.826 [2024-11-16 16:43:21.422171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.826 #10 NEW cov: 11739 ft: 12064 corp: 3/30b lim: 30 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 CrossOver- 00:07:35.826 [2024-11-16 16:43:21.471772] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.826 [2024-11-16 16:43:21.471905] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.826 [2024-11-16 16:43:21.472015] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (65796) > buf size (4096) 00:07:35.826 [2024-11-16 16:43:21.472222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.826 [2024-11-16 16:43:21.472248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.826 [2024-11-16 16:43:21.472301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff 
cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.826 [2024-11-16 16:43:21.472316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.826 [2024-11-16 16:43:21.472368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:40400040 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.826 [2024-11-16 16:43:21.472382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:35.826 #11 NEW cov: 11768 ft: 12595 corp: 4/53b lim: 30 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:07:35.826 [2024-11-16 16:43:21.511822] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.826 [2024-11-16 16:43:21.511936] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.826 [2024-11-16 16:43:21.512135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.826 [2024-11-16 16:43:21.512161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.826 [2024-11-16 16:43:21.512217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.826 [2024-11-16 16:43:21.512234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.826 #12 NEW cov: 11853 ft: 12919 corp: 5/68b lim: 30 exec/s: 0 rss: 67Mb L: 15/23 MS: 1 ChangeBit- 00:07:35.826 [2024-11-16 16:43:21.551940] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.826 [2024-11-16 16:43:21.552056] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:35.826 [2024-11-16 16:43:21.552268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.826 [2024-11-16 16:43:21.552294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.826 [2024-11-16 16:43:21.552350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.826 [2024-11-16 16:43:21.552364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.086 #13 NEW cov: 11853 ft: 13055 corp: 6/83b lim: 30 exec/s: 0 rss: 67Mb L: 15/23 MS: 1 ChangeBinInt- 00:07:36.086 [2024-11-16 16:43:21.592074] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.592210] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.592418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.592444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.086 [2024-11-16 
16:43:21.592500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff8324 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.592515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.086 #14 NEW cov: 11853 ft: 13103 corp: 7/99b lim: 30 exec/s: 0 rss: 67Mb L: 16/23 MS: 1 InsertByte- 00:07:36.086 [2024-11-16 16:43:21.632201] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.632315] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.632522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.632547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.086 [2024-11-16 16:43:21.632605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff8324 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.632619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.086 #15 NEW cov: 11853 ft: 13242 corp: 8/115b lim: 30 exec/s: 0 rss: 67Mb L: 16/23 MS: 1 CopyPart- 00:07:36.086 [2024-11-16 16:43:21.672295] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.672410] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.672645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.672675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.086 [2024-11-16 16:43:21.672731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.672747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.086 #16 NEW cov: 11853 ft: 13274 corp: 9/131b lim: 30 exec/s: 0 rss: 67Mb L: 16/23 MS: 1 InsertByte- 00:07:36.086 [2024-11-16 16:43:21.712407] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.712519] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.712740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.712766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.086 [2024-11-16 16:43:21.712823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.712837] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.086 #17 NEW cov: 11853 ft: 13310 corp: 10/145b lim: 30 exec/s: 0 rss: 67Mb L: 14/23 MS: 1 ShuffleBytes- 00:07:36.086 [2024-11-16 16:43:21.752590] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.752714] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.752827] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004dff 00:07:36.086 [2024-11-16 16:43:21.752936] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.753147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.753173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.086 [2024-11-16 16:43:21.753229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.753243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.086 [2024-11-16 16:43:21.753297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.753310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.086 [2024-11-16 16:43:21.753365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.753378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.086 #18 NEW cov: 11853 ft: 13819 corp: 11/172b lim: 30 exec/s: 0 rss: 67Mb L: 27/27 MS: 1 CopyPart- 00:07:36.086 [2024-11-16 16:43:21.792707] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.792825] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.793036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.793061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.086 [2024-11-16 16:43:21.793117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.793131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.086 #19 NEW cov: 11853 ft: 13864 corp: 12/187b lim: 30 exec/s: 0 rss: 67Mb L: 15/27 MS: 1 ChangeByte- 00:07:36.086 [2024-11-16 16:43:21.832759] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 
[2024-11-16 16:43:21.832873] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.086 [2024-11-16 16:43:21.833072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.833097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.086 [2024-11-16 16:43:21.833156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.086 [2024-11-16 16:43:21.833170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.346 #20 NEW cov: 11853 ft: 13935 corp: 13/201b lim: 30 exec/s: 0 rss: 67Mb L: 14/27 MS: 1 ChangeBit- 00:07:36.346 [2024-11-16 16:43:21.872876] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.346 [2024-11-16 16:43:21.872993] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.346 [2024-11-16 16:43:21.873204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:21.873229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.346 [2024-11-16 16:43:21.873284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:21.873298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.346 #21 NEW cov: 11853 ft: 13967 corp: 14/217b lim: 30 exec/s: 0 rss: 68Mb L: 16/27 MS: 1 InsertByte- 00:07:36.346 [2024-11-16 16:43:21.913055] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.346 [2024-11-16 16:43:21.913167] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.346 [2024-11-16 16:43:21.913362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:21.913387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.346 [2024-11-16 16:43:21.913445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff8324 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:21.913458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.346 #22 NEW cov: 11853 ft: 14002 corp: 15/234b lim: 30 exec/s: 0 rss: 68Mb L: 17/27 MS: 1 InsertByte- 00:07:36.346 [2024-11-16 16:43:21.953089] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.346 [2024-11-16 16:43:21.953292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:21.953316] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.346 #23 NEW cov: 11853 ft: 14438 corp: 16/244b lim: 30 exec/s: 0 rss: 68Mb L: 10/27 MS: 1 EraseBytes- 00:07:36.346 [2024-11-16 16:43:21.993252] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ffff 00:07:36.346 [2024-11-16 16:43:21.993366] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.346 [2024-11-16 16:43:21.993570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff81ff cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:21.993599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.346 [2024-11-16 16:43:21.993656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:21.993674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.346 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:36.346 #24 NEW cov: 11876 ft: 14467 corp: 17/260b lim: 30 exec/s: 0 rss: 68Mb L: 16/27 MS: 1 InsertByte- 00:07:36.346 [2024-11-16 16:43:22.033378] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.346 [2024-11-16 16:43:22.033500] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.346 [2024-11-16 16:43:22.033610] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000009ff 00:07:36.346 [2024-11-16 16:43:22.033820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:22.033846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.346 [2024-11-16 16:43:22.033902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:22.033916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.346 [2024-11-16 16:43:22.033972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:09098109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:22.033985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.346 #25 NEW cov: 11876 ft: 14556 corp: 18/279b lim: 30 exec/s: 0 rss: 68Mb L: 19/27 MS: 1 InsertRepeatedBytes- 00:07:36.346 [2024-11-16 16:43:22.073403] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.346 [2024-11-16 16:43:22.073614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.346 [2024-11-16 16:43:22.073640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:36.606 #26 NEW cov: 11876 ft: 14680 corp: 19/287b lim: 30 exec/s: 26 rss: 68Mb L: 8/27 MS: 1 EraseBytes- 00:07:36.606 [2024-11-16 16:43:22.113578] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.113701] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.113911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.113937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.113992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:b8ff8324 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.114006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.607 #27 NEW cov: 11876 ft: 14692 corp: 20/303b lim: 30 exec/s: 27 rss: 68Mb L: 16/27 MS: 1 ChangeByte- 00:07:36.607 [2024-11-16 16:43:22.143625] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (104448) > buf size (4096) 00:07:36.607 [2024-11-16 16:43:22.143750] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ffff 00:07:36.607 [2024-11-16 16:43:22.143965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.143991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.144047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.144061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.607 #28 NEW cov: 11876 ft: 14763 corp: 21/320b lim: 30 exec/s: 28 rss: 68Mb L: 17/27 MS: 1 ChangeBinInt- 00:07:36.607 [2024-11-16 16:43:22.183850] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.183965] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.184075] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004dff 00:07:36.607 [2024-11-16 16:43:22.184183] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.184399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.184424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.184481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.184496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.184550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.184564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.184619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.184632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.607 #29 NEW cov: 11876 ft: 14782 corp: 22/347b lim: 30 exec/s: 29 rss: 68Mb L: 27/27 MS: 1 CopyPart- 00:07:36.607 [2024-11-16 16:43:22.223830] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.224039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:655b83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.224064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.607 #30 NEW cov: 11876 ft: 14791 corp: 23/355b lim: 30 exec/s: 30 rss: 68Mb L: 8/27 MS: 1 ChangeByte- 00:07:36.607 [2024-11-16 16:43:22.264018] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.264132] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.264335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.264360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.264415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff2483ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.264432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.607 #31 NEW cov: 11876 ft: 14828 corp: 24/371b lim: 30 exec/s: 31 rss: 68Mb L: 16/27 MS: 1 ShuffleBytes- 00:07:36.607 [2024-11-16 16:43:22.294142] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.294272] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.607 [2024-11-16 16:43:22.294384] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1000009ff 00:07:36.607 [2024-11-16 16:43:22.294493] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000909 00:07:36.607 [2024-11-16 16:43:22.294702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.294730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.294786] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.294800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.294854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:09098109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.294868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.294924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.294937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.607 #32 NEW cov: 11876 ft: 14840 corp: 25/399b lim: 30 exec/s: 32 rss: 68Mb L: 28/28 MS: 1 CopyPart- 00:07:36.607 [2024-11-16 16:43:22.334216] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff01 00:07:36.607 [2024-11-16 16:43:22.334517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.334542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.607 [2024-11-16 16:43:22.334597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.607 [2024-11-16 16:43:22.334611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.867 #33 NEW cov: 11893 ft: 14870 corp: 26/415b lim: 30 exec/s: 33 rss: 68Mb L: 16/28 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:36.867 [2024-11-16 16:43:22.374452] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.867 [2024-11-16 16:43:22.374570] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000707 00:07:36.867 [2024-11-16 16:43:22.374686] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000707 00:07:36.867 [2024-11-16 16:43:22.374792] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000707 00:07:36.867 [2024-11-16 16:43:22.374895] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300002a0d 00:07:36.868 [2024-11-16 16:43:22.375104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.375129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.375184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.375201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:36.868 [2024-11-16 16:43:22.375250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:07078307 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.375263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.375315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:07078307 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.375328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.375380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:07ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.375393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:36.868 #34 NEW cov: 11893 ft: 14942 corp: 27/445b lim: 30 exec/s: 34 rss: 68Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:36.868 [2024-11-16 16:43:22.414436] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.868 [2024-11-16 16:43:22.414550] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.868 [2024-11-16 16:43:22.414762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.414788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.414844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.414858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.868 #35 NEW cov: 11893 ft: 14967 corp: 28/460b lim: 30 exec/s: 35 rss: 68Mb L: 15/30 MS: 1 ChangeBinInt- 00:07:36.868 [2024-11-16 16:43:22.454611] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.868 [2024-11-16 16:43:22.454731] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.868 [2024-11-16 16:43:22.454839] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (271400) > buf size (4096) 00:07:36.868 [2024-11-16 16:43:22.455136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.455162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.455217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.455230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.455284] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:09098109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.455298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.455350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.455363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.868 #36 NEW cov: 11893 ft: 14999 corp: 29/488b lim: 30 exec/s: 36 rss: 68Mb L: 28/30 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:36.868 [2024-11-16 16:43:22.494731] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.868 [2024-11-16 16:43:22.494854] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:36.868 [2024-11-16 16:43:22.495194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.495220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.495277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.495291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.495346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.495360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.868 #37 NEW cov: 11893 ft: 15011 corp: 30/511b lim: 30 exec/s: 37 rss: 69Mb L: 23/30 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:36.868 [2024-11-16 16:43:22.534866] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.868 [2024-11-16 16:43:22.534987] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:36.868 [2024-11-16 16:43:22.535297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.535323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.535379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.535393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.535447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.535461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.868 #38 NEW cov: 11893 ft: 15021 corp: 31/533b lim: 30 exec/s: 38 rss: 69Mb L: 22/30 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:36.868 [2024-11-16 16:43:22.574976] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10772) > buf size (4096) 00:07:36.868 [2024-11-16 16:43:22.575097] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (135700) > buf size (4096) 00:07:36.868 [2024-11-16 16:43:22.575210] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (135700) > buf size (4096) 00:07:36.868 [2024-11-16 16:43:22.575325] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (135700) > buf size (4096) 00:07:36.868 [2024-11-16 16:43:22.575534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a840084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.575558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.575613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:84840084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.575627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.575687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:84840084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.575701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.575754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:84840084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.575767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.868 #39 NEW cov: 11893 ft: 15060 corp: 32/560b lim: 30 exec/s: 39 rss: 69Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:07:36.868 [2024-11-16 16:43:22.615049] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.868 [2024-11-16 16:43:22.615164] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.868 [2024-11-16 16:43:22.615370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.615396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.868 [2024-11-16 16:43:22.615451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff8324 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.868 [2024-11-16 16:43:22.615465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.128 #40 NEW cov: 11893 ft: 15065 corp: 33/576b lim: 30 exec/s: 40 rss: 
69Mb L: 16/30 MS: 1 ChangeByte- 00:07:37.128 [2024-11-16 16:43:22.645208] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10772) > buf size (4096) 00:07:37.128 [2024-11-16 16:43:22.645325] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (135828) > buf size (4096) 00:07:37.128 [2024-11-16 16:43:22.645435] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (135700) > buf size (4096) 00:07:37.128 [2024-11-16 16:43:22.645543] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (135700) > buf size (4096) 00:07:37.128 [2024-11-16 16:43:22.645760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a840084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.128 [2024-11-16 16:43:22.645786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.128 [2024-11-16 16:43:22.645840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:84a40084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.128 [2024-11-16 16:43:22.645854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.128 [2024-11-16 16:43:22.645904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:84840084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.128 [2024-11-16 16:43:22.645917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.128 [2024-11-16 16:43:22.645971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:84840084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.128 [2024-11-16 16:43:22.645985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.128 #41 NEW cov: 11893 ft: 15071 corp: 34/603b lim: 30 exec/s: 41 rss: 69Mb L: 27/30 MS: 1 ChangeBit- 00:07:37.128 [2024-11-16 16:43:22.685320] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.128 [2024-11-16 16:43:22.685437] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.128 [2024-11-16 16:43:22.685551] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.128 [2024-11-16 16:43:22.685764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.128 [2024-11-16 16:43:22.685791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.128 [2024-11-16 16:43:22.685847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.128 [2024-11-16 16:43:22.685860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.129 [2024-11-16 16:43:22.685912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ff65835b cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.685925] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.129 #42 NEW cov: 11893 ft: 15080 corp: 35/625b lim: 30 exec/s: 42 rss: 69Mb L: 22/30 MS: 1 CrossOver- 00:07:37.129 [2024-11-16 16:43:22.725388] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.129 [2024-11-16 16:43:22.725505] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.129 [2024-11-16 16:43:22.725618] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (271400) > buf size (4096) 00:07:37.129 [2024-11-16 16:43:22.725829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.725855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.129 [2024-11-16 16:43:22.725911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.725925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.129 [2024-11-16 16:43:22.725978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:09098109 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.725991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.129 #43 NEW cov: 11893 ft: 15083 corp: 36/645b lim: 30 exec/s: 43 rss: 69Mb L: 20/30 MS: 1 CrossOver- 00:07:37.129 [2024-11-16 16:43:22.765528] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.129 [2024-11-16 16:43:22.765657] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ffff 00:07:37.129 [2024-11-16 16:43:22.765775] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.129 [2024-11-16 16:43:22.766002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.766028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.129 [2024-11-16 16:43:22.766084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff8124 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.766098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.129 [2024-11-16 16:43:22.766152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.766165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.129 #44 NEW cov: 11893 ft: 15102 corp: 37/664b lim: 30 exec/s: 44 rss: 69Mb L: 19/30 MS: 1 CrossOver- 00:07:37.129 [2024-11-16 16:43:22.805601] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 
00:07:37.129 [2024-11-16 16:43:22.805817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.805843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.129 #45 NEW cov: 11893 ft: 15114 corp: 38/673b lim: 30 exec/s: 45 rss: 69Mb L: 9/30 MS: 1 CopyPart- 00:07:37.129 [2024-11-16 16:43:22.845784] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.129 [2024-11-16 16:43:22.845902] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.129 [2024-11-16 16:43:22.846019] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (590084) > buf size (4096) 00:07:37.129 [2024-11-16 16:43:22.846239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.846263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.129 [2024-11-16 16:43:22.846317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.846330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.129 [2024-11-16 16:43:22.846385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:40400240 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.129 [2024-11-16 16:43:22.846399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.129 #46 NEW cov: 11893 ft: 15154 corp: 39/696b lim: 30 exec/s: 46 rss: 69Mb L: 23/30 MS: 1 CMP- DE: "\002\000"- 00:07:37.389 [2024-11-16 16:43:22.885878] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.389 [2024-11-16 16:43:22.885993] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000fff7 00:07:37.389 [2024-11-16 16:43:22.886197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:22.886223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.389 [2024-11-16 16:43:22.886278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:22.886292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.389 #47 NEW cov: 11893 ft: 15196 corp: 40/710b lim: 30 exec/s: 47 rss: 69Mb L: 14/30 MS: 1 ChangeBit- 00:07:37.389 [2024-11-16 16:43:22.925972] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.389 [2024-11-16 16:43:22.926083] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.389 [2024-11-16 16:43:22.926297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:22.926322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.389 [2024-11-16 16:43:22.926379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:22.926392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.389 #48 NEW cov: 11893 ft: 15197 corp: 41/726b lim: 30 exec/s: 48 rss: 69Mb L: 16/30 MS: 1 CopyPart- 00:07:37.389 [2024-11-16 16:43:22.966152] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.389 [2024-11-16 16:43:22.966264] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.389 [2024-11-16 16:43:22.966479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:22.966504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.389 [2024-11-16 16:43:22.966560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:89ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:22.966574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.389 #49 NEW cov: 11893 ft: 15268 corp: 42/742b lim: 30 exec/s: 49 rss: 69Mb L: 16/30 MS: 1 InsertByte- 00:07:37.389 [2024-11-16 16:43:23.006279] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10772) > buf size (4096) 00:07:37.389 [2024-11-16 16:43:23.006396] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (135700) > buf size (4096) 00:07:37.389 [2024-11-16 16:43:23.006507] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (33924) > len (4) 00:07:37.389 [2024-11-16 16:43:23.006615] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (135700) > buf size (4096) 00:07:37.389 [2024-11-16 16:43:23.006809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a840084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:23.006835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.389 [2024-11-16 16:43:23.006893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:84840001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:23.006906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.389 [2024-11-16 16:43:23.006961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:23.006975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.389 [2024-11-16 16:43:23.007030] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:84840084 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.389 [2024-11-16 16:43:23.007044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.390 #50 NEW cov: 11899 ft: 15284 corp: 43/769b lim: 30 exec/s: 50 rss: 69Mb L: 27/30 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:37.390 [2024-11-16 16:43:23.046393] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.390 [2024-11-16 16:43:23.046510] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.390 [2024-11-16 16:43:23.046618] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300004dff 00:07:37.390 [2024-11-16 16:43:23.046733] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.390 [2024-11-16 16:43:23.046941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:65ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.390 [2024-11-16 16:43:23.046967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.390 [2024-11-16 16:43:23.047022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.390 [2024-11-16 16:43:23.047039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.390 [2024-11-16 16:43:23.047094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.390 [2024-11-16 16:43:23.047108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.390 [2024-11-16 16:43:23.047164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.390 [2024-11-16 16:43:23.047178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.390 #51 NEW cov: 11899 ft: 15300 corp: 44/796b lim: 30 exec/s: 51 rss: 69Mb L: 27/30 MS: 1 ShuffleBytes- 00:07:37.390 [2024-11-16 16:43:23.086540] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.390 [2024-11-16 16:43:23.086658] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.390 [2024-11-16 16:43:23.086772] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100000909 00:07:37.390 [2024-11-16 16:43:23.086877] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff09 00:07:37.390 [2024-11-16 16:43:23.087086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:e5ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.390 [2024-11-16 16:43:23.087111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.390 [2024-11-16 16:43:23.087167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE 
(02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:37.390 [2024-11-16 16:43:23.087181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:37.390 [2024-11-16 16:43:23.087234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:09098121 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:37.390 [2024-11-16 16:43:23.087247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:37.390 [2024-11-16 16:43:23.087302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:37.390 [2024-11-16 16:43:23.087315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:37.390 #52 NEW cov: 11899 ft: 15303 corp: 45/825b lim: 30 exec/s: 26 rss: 69Mb L: 29/30 MS: 1 InsertByte-
00:07:37.390 #52 DONE cov: 11899 ft: 15303 corp: 45/825b lim: 30 exec/s: 26 rss: 69Mb
00:07:37.390 ###### Recommended dictionary. ######
00:07:37.390 "\001\000\000\000\000\000\000\000" # Uses: 4
00:07:37.390 "\002\000" # Uses: 0
00:07:37.390 ###### End of recommended dictionary. ######
00:07:37.390 Done 52 runs in 2 second(s)
00:07:37.650 16:43:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf
00:07:37.650 16:43:23 -- ../common.sh@72 -- # (( i++ ))
00:07:37.650 16:43:23 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:37.650 16:43:23 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1
00:07:37.650 16:43:23 -- nvmf/run.sh@23 -- # local fuzzer_type=2
00:07:37.650 16:43:23 -- nvmf/run.sh@24 -- # local timen=1
00:07:37.650 16:43:23 -- nvmf/run.sh@25 -- # local core=0x1
00:07:37.650 16:43:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2
00:07:37.650 16:43:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf
00:07:37.650 16:43:23 -- nvmf/run.sh@29 -- # printf %02d 2
00:07:37.650 16:43:23 -- nvmf/run.sh@29 -- # port=4402
00:07:37.650 16:43:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2
00:07:37.650 16:43:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402'
00:07:37.650 16:43:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:37.650 16:43:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock
00:07:37.650 [2024-11-16 16:43:23.256553] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
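For reference when reading the GET LOG PAGE (02) traffic in the run that just finished: the ctrlr.c:2504 and ctrlr.c:2516 rejections come from offset and length validation in the target's Get Log Page handler. The sketch below shows that style of check, assuming the NVMe base-spec field layout (LID in CDW10 bits 7:0, the 0's-based dword count split across CDW10/CDW11, the 64-bit offset in CDW12/CDW13); the struct and function names are illustrative and are not SPDK's actual definitions.

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Illustrative decode of Get Log Page (opcode 0x02); layout per the
     * NVMe base specification, names invented for this sketch. */
    struct get_log_page_cmd {
        uint32_t cdw10;   /* bits 7:0 LID, bits 31:16 NUMDL */
        uint32_t cdw11;   /* bits 15:0 NUMDU */
        uint32_t cdw12;   /* LPOL: log page offset, low 32 bits */
        uint32_t cdw13;   /* LPOU: log page offset, high 32 bits */
    };

    static bool get_log_page_ok(const struct get_log_page_cmd *cmd,
                                uint64_t log_len, size_t buf_size)
    {
        uint32_t numdl = (cmd->cdw10 >> 16) & 0xffff;
        uint32_t numdu = cmd->cdw11 & 0xffff;
        /* 0's-based dword count, so len = (NUMD + 1) * 4 bytes. For
         * cdw10:09098109 cdw11:00000001 this gives 271400 bytes, which is
         * exactly the "len (271400) > buf size (4096)" rejection above. */
        uint64_t len = ((((uint64_t)numdu << 16) | numdl) + 1) * 4;
        uint64_t off = ((uint64_t)cmd->cdw13 << 32) | cmd->cdw12;

        if (off > log_len) {
            return false;   /* "Invalid log page offset 0x30000ffff" */
        }
        if (len > buf_size) {
            return false;   /* "Get log page: len (...) > buf size (4096)" */
        }
        return true;
    }

    int main(void)
    {
        /* cdw10/cdw11 taken from the run above; the offset pair is set to
         * the recurring 0x30000ffff (LPOU=0x3, LPOL=0xffff). */
        struct get_log_page_cmd c = { 0x09098109, 0x00000001, 0x0000ffff, 0x3 };
        printf("ok=%d\n", get_log_page_ok(&c, 4096, 4096));
        return 0;
    }

The recurring offset 0x30000ffff is only reachable through the CDW12/CDW13 pair, which the printout above does not show (it prints CDW10/CDW11 only); that is why the same offset error keeps appearing for otherwise different-looking commands.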
00:07:37.650 [2024-11-16 16:43:23.256644] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid484019 ] 00:07:37.650 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.910 [2024-11-16 16:43:23.443349] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.910 [2024-11-16 16:43:23.463951] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:37.910 [2024-11-16 16:43:23.464065] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.910 [2024-11-16 16:43:23.515426] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.910 [2024-11-16 16:43:23.531768] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:37.910 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.910 INFO: Seed: 2928264397 00:07:37.910 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:37.910 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:37.910 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:37.910 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.910 #2 INITED exec/s: 0 rss: 59Mb 00:07:37.910 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:37.910 This may also happen if the target rejected all inputs we tried so far 00:07:37.910 [2024-11-16 16:43:23.576332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.910 [2024-11-16 16:43:23.576365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.170 NEW_FUNC[1/670]: 0x454738 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:38.170 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.170 #4 NEW cov: 11581 ft: 11582 corp: 2/12b lim: 35 exec/s: 0 rss: 67Mb L: 11/11 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:38.170 [2024-11-16 16:43:23.907191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.170 [2024-11-16 16:43:23.907228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.170 [2024-11-16 16:43:23.907261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a6a600a6 cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.170 [2024-11-16 16:43:23.907276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.429 #5 NEW cov: 11697 ft: 12399 corp: 3/26b lim: 35 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 InsertRepeatedBytes- 00:07:38.430 [2024-11-16 16:43:23.967085] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.430 [2024-11-16 16:43:23.967228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY 
(06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.430 [2024-11-16 16:43:23.967252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.430 #6 NEW cov: 11712 ft: 12663 corp: 4/36b lim: 35 exec/s: 0 rss: 68Mb L: 10/14 MS: 1 EraseBytes- 00:07:38.430 [2024-11-16 16:43:24.027229] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.430 [2024-11-16 16:43:24.027364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.430 [2024-11-16 16:43:24.027388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.430 #7 NEW cov: 11797 ft: 13005 corp: 5/47b lim: 35 exec/s: 0 rss: 68Mb L: 11/14 MS: 1 InsertByte- 00:07:38.430 [2024-11-16 16:43:24.087550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.430 [2024-11-16 16:43:24.087579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.430 [2024-11-16 16:43:24.087625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a6a600a6 cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.430 [2024-11-16 16:43:24.087640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.430 #13 NEW cov: 11797 ft: 13139 corp: 6/61b lim: 35 exec/s: 0 rss: 68Mb L: 14/14 MS: 1 ShuffleBytes- 00:07:38.430 [2024-11-16 16:43:24.147540] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.430 [2024-11-16 16:43:24.147683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.430 [2024-11-16 16:43:24.147707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.689 #14 NEW cov: 11797 ft: 13256 corp: 7/71b lim: 35 exec/s: 0 rss: 68Mb L: 10/14 MS: 1 EraseBytes- 00:07:38.689 [2024-11-16 16:43:24.207699] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.689 [2024-11-16 16:43:24.207832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.689 [2024-11-16 16:43:24.207855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.689 #25 NEW cov: 11797 ft: 13352 corp: 8/81b lim: 35 exec/s: 0 rss: 68Mb L: 10/14 MS: 1 CopyPart- 00:07:38.689 [2024-11-16 16:43:24.257839] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.689 [2024-11-16 16:43:24.257972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.689 [2024-11-16 16:43:24.257996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.689 #26 NEW cov: 11797 ft: 13463 corp: 9/91b lim: 35 exec/s: 0 rss: 68Mb L: 10/14 MS: 1 ShuffleBytes- 00:07:38.689 [2024-11-16 16:43:24.308157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:2700a6d2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.689 [2024-11-16 16:43:24.308186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.689 [2024-11-16 16:43:24.308254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:a6a600a6 cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.689 [2024-11-16 16:43:24.308271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.689 #27 NEW cov: 11797 ft: 13663 corp: 10/113b lim: 35 exec/s: 0 rss: 68Mb L: 22/22 MS: 1 CMP- DE: "\322'\010\002\000\000\000\000"- 00:07:38.689 [2024-11-16 16:43:24.368124] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.690 [2024-11-16 16:43:24.368262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.690 [2024-11-16 16:43:24.368285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.690 #28 NEW cov: 11797 ft: 13730 corp: 11/124b lim: 35 exec/s: 0 rss: 68Mb L: 11/22 MS: 1 ChangeBinInt- 00:07:38.690 [2024-11-16 16:43:24.419268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:56560056 cdw11:56005656 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.690 [2024-11-16 16:43:24.419295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.949 #31 NEW cov: 11797 ft: 14118 corp: 12/140b lim: 35 exec/s: 0 rss: 68Mb L: 16/22 MS: 3 ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:07:38.949 [2024-11-16 16:43:24.458940] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.949 [2024-11-16 16:43:24.459174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.949 [2024-11-16 16:43:24.459202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.949 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.949 #32 NEW cov: 11820 ft: 14210 corp: 13/150b lim: 35 exec/s: 0 rss: 68Mb L: 10/22 MS: 1 ChangeByte- 00:07:38.949 [2024-11-16 16:43:24.499213] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.949 [2024-11-16 16:43:24.499504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:56000056 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.949 [2024-11-16 16:43:24.499534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.949 #33 NEW cov: 11820 ft: 14248 corp: 14/166b lim: 35 exec/s: 0 rss: 68Mb L: 16/22 MS: 1 PersAutoDict- DE: 
"\322'\010\002\000\000\000\000"- 00:07:38.949 [2024-11-16 16:43:24.549715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:2700a6d2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.949 [2024-11-16 16:43:24.549740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.949 [2024-11-16 16:43:24.549881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:31310031 cdw11:a60031a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.949 [2024-11-16 16:43:24.549896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.949 #34 NEW cov: 11820 ft: 14266 corp: 15/193b lim: 35 exec/s: 34 rss: 68Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:38.949 [2024-11-16 16:43:24.589570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.949 [2024-11-16 16:43:24.589596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.949 #35 NEW cov: 11820 ft: 14286 corp: 16/204b lim: 35 exec/s: 35 rss: 68Mb L: 11/27 MS: 1 ShuffleBytes- 00:07:38.949 [2024-11-16 16:43:24.629425] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.949 [2024-11-16 16:43:24.629642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.949 [2024-11-16 16:43:24.629675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.950 #36 NEW cov: 11820 ft: 14312 corp: 17/216b lim: 35 exec/s: 36 rss: 68Mb L: 12/27 MS: 1 InsertByte- 00:07:38.950 [2024-11-16 16:43:24.669725] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.950 [2024-11-16 16:43:24.669951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6d2000a cdw11:02002708 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.950 [2024-11-16 16:43:24.669977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.950 [2024-11-16 16:43:24.670037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.950 [2024-11-16 16:43:24.670053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.950 #37 NEW cov: 11820 ft: 14376 corp: 18/230b lim: 35 exec/s: 37 rss: 68Mb L: 14/27 MS: 1 PersAutoDict- DE: "\322'\010\002\000\000\000\000"- 00:07:39.210 [2024-11-16 16:43:24.709695] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.210 [2024-11-16 16:43:24.710029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.710056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.210 
[2024-11-16 16:43:24.710114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.710129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.210 #38 NEW cov: 11820 ft: 14497 corp: 19/248b lim: 35 exec/s: 38 rss: 68Mb L: 18/27 MS: 1 CopyPart- 00:07:39.210 [2024-11-16 16:43:24.749978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.750004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.210 #39 NEW cov: 11820 ft: 14513 corp: 20/259b lim: 35 exec/s: 39 rss: 68Mb L: 11/27 MS: 1 CrossOver- 00:07:39.210 [2024-11-16 16:43:24.780196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.780221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.210 [2024-11-16 16:43:24.780279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a6a600a6 cdw11:a600d9a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.780292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.210 #40 NEW cov: 11820 ft: 14582 corp: 21/273b lim: 35 exec/s: 40 rss: 68Mb L: 14/27 MS: 1 ChangeByte- 00:07:39.210 [2024-11-16 16:43:24.820018] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.210 [2024-11-16 16:43:24.820232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.820260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.210 #41 NEW cov: 11820 ft: 14640 corp: 22/284b lim: 35 exec/s: 41 rss: 68Mb L: 11/27 MS: 1 ShuffleBytes- 00:07:39.210 [2024-11-16 16:43:24.860123] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.210 [2024-11-16 16:43:24.860350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.860377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.210 #42 NEW cov: 11820 ft: 14696 corp: 23/294b lim: 35 exec/s: 42 rss: 68Mb L: 10/27 MS: 1 CopyPart- 00:07:39.210 [2024-11-16 16:43:24.900627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.900652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.210 [2024-11-16 16:43:24.900717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a6a600a6 
cdw11:a600d92d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.900730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.210 #43 NEW cov: 11820 ft: 14726 corp: 24/309b lim: 35 exec/s: 43 rss: 68Mb L: 15/27 MS: 1 InsertByte- 00:07:39.210 [2024-11-16 16:43:24.940472] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.210 [2024-11-16 16:43:24.940988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:bfbf0000 cdw11:bf00bfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.941017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.210 [2024-11-16 16:43:24.941077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:bfbf00bf cdw11:bf00bfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.941091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.210 [2024-11-16 16:43:24.941148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:bfbf00bf cdw11:bf00bfbf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.941162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.210 [2024-11-16 16:43:24.941218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:bfbf00bf cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.210 [2024-11-16 16:43:24.941232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.470 #44 NEW cov: 11820 ft: 15232 corp: 25/342b lim: 35 exec/s: 44 rss: 68Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:39.470 [2024-11-16 16:43:24.990872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6ff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:24.990897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.470 [2024-11-16 16:43:24.990952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d22700ff cdw11:00000802 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:24.990966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.470 #45 NEW cov: 11820 ft: 15275 corp: 26/362b lim: 35 exec/s: 45 rss: 68Mb L: 20/33 MS: 1 InsertRepeatedBytes- 00:07:39.470 [2024-11-16 16:43:25.030636] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.470 [2024-11-16 16:43:25.030863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.030891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.470 #46 NEW cov: 11820 ft: 15286 corp: 27/371b lim: 35 exec/s: 46 rss: 68Mb L: 9/33 MS: 1 EraseBytes- 00:07:39.470 
[2024-11-16 16:43:25.071072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600e6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.071096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.470 [2024-11-16 16:43:25.071157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:a6a600a6 cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.071171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.470 #47 NEW cov: 11820 ft: 15290 corp: 28/385b lim: 35 exec/s: 47 rss: 68Mb L: 14/33 MS: 1 ChangeBit- 00:07:39.470 [2024-11-16 16:43:25.111150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.111175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.470 [2024-11-16 16:43:25.111231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.111245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.470 #48 NEW cov: 11820 ft: 15317 corp: 29/400b lim: 35 exec/s: 48 rss: 69Mb L: 15/33 MS: 1 CMP- DE: "\001\000\000\004"- 00:07:39.470 [2024-11-16 16:43:25.150979] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.470 [2024-11-16 16:43:25.151205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00d20000 cdw11:02002708 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.151233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.470 #49 NEW cov: 11820 ft: 15326 corp: 30/410b lim: 35 exec/s: 49 rss: 69Mb L: 10/33 MS: 1 PersAutoDict- DE: "\322'\010\002\000\000\000\000"- 00:07:39.470 [2024-11-16 16:43:25.181338] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.470 [2024-11-16 16:43:25.181662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6ff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.181691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.470 [2024-11-16 16:43:25.181750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d22700ff cdw11:00000802 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.181764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.470 [2024-11-16 16:43:25.181819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00a60000 cdw11:ff00a6ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.181835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.470 [2024-11-16 16:43:25.181892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.470 [2024-11-16 16:43:25.181906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.470 #50 NEW cov: 11820 ft: 15357 corp: 31/441b lim: 35 exec/s: 50 rss: 69Mb L: 31/33 MS: 1 InsertRepeatedBytes- 00:07:39.730 [2024-11-16 16:43:25.221210] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.730 [2024-11-16 16:43:25.221428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.221455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.730 #51 NEW cov: 11820 ft: 15375 corp: 32/453b lim: 35 exec/s: 51 rss: 69Mb L: 12/33 MS: 1 InsertByte- 00:07:39.730 [2024-11-16 16:43:25.261640] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.730 [2024-11-16 16:43:25.262113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6ff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.262138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.730 [2024-11-16 16:43:25.262198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d22700ff cdw11:01000802 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.262212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.730 [2024-11-16 16:43:25.262270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.262286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.730 [2024-11-16 16:43:25.262345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00a6 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.262359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.730 [2024-11-16 16:43:25.262417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:a600ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.262431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.730 #52 NEW cov: 11820 ft: 15442 corp: 33/488b lim: 35 exec/s: 52 rss: 69Mb L: 35/35 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:39.730 [2024-11-16 16:43:25.311456] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.730 [2024-11-16 16:43:25.311683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.311709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.730 #53 NEW cov: 11820 ft: 15503 corp: 34/499b lim: 35 exec/s: 53 rss: 69Mb L: 11/35 MS: 1 ChangeBinInt- 00:07:39.730 [2024-11-16 16:43:25.351912] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.730 [2024-11-16 16:43:25.352348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6ff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.352374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.730 [2024-11-16 16:43:25.352436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d22700ff cdw11:01000802 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.352450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.730 [2024-11-16 16:43:25.352511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.352526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.730 [2024-11-16 16:43:25.352585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00a6 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.730 [2024-11-16 16:43:25.352598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.730 [2024-11-16 16:43:25.352661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:0aff00ff cdw11:a600ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.731 [2024-11-16 16:43:25.352679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.731 #54 NEW cov: 11820 ft: 15519 corp: 35/534b lim: 35 exec/s: 54 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:07:39.731 [2024-11-16 16:43:25.401716] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.731 [2024-11-16 16:43:25.401938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.731 [2024-11-16 16:43:25.401966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.731 #55 NEW cov: 11820 ft: 15563 corp: 36/542b lim: 35 exec/s: 55 rss: 69Mb L: 8/35 MS: 1 EraseBytes- 00:07:39.731 [2024-11-16 16:43:25.442155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6a6000a cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.731 [2024-11-16 16:43:25.442180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.731 [2024-11-16 16:43:25.442256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 
nsid:0 cdw10:a6a600a6 cdw11:a600a6a6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.731 [2024-11-16 16:43:25.442270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.731 #56 NEW cov: 11820 ft: 15569 corp: 37/561b lim: 35 exec/s: 56 rss: 69Mb L: 19/35 MS: 1 CrossOver- 00:07:39.991 [2024-11-16 16:43:25.482550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff0034 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.482575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.991 [2024-11-16 16:43:25.482631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.482646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.991 [2024-11-16 16:43:25.482705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.482719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.991 [2024-11-16 16:43:25.482778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.482792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.991 #60 NEW cov: 11820 ft: 15574 corp: 38/593b lim: 35 exec/s: 60 rss: 69Mb L: 32/35 MS: 4 ChangeBit-InsertByte-CrossOver-InsertRepeatedBytes- 00:07:39.991 [2024-11-16 16:43:25.522334] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.991 [2024-11-16 16:43:25.522575] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.991 [2024-11-16 16:43:25.522803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:a6ff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.522830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.991 [2024-11-16 16:43:25.522892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d22700ff cdw11:01000802 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.522906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.991 [2024-11-16 16:43:25.522969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.522985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.991 [2024-11-16 16:43:25.523046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000023 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:39.991 [2024-11-16 16:43:25.523060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.991 [2024-11-16 16:43:25.523119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff0000 cdw11:a600ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.523135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.991 #61 NEW cov: 11820 ft: 15618 corp: 39/628b lim: 35 exec/s: 61 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:39.991 [2024-11-16 16:43:25.562179] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.991 [2024-11-16 16:43:25.562500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.562525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.991 [2024-11-16 16:43:25.562580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.991 [2024-11-16 16:43:25.562594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.991 #62 NEW cov: 11820 ft: 15637 corp: 40/646b lim: 35 exec/s: 31 rss: 69Mb L: 18/35 MS: 1 CopyPart- 00:07:39.991 #62 DONE cov: 11820 ft: 15637 corp: 40/646b lim: 35 exec/s: 31 rss: 69Mb 00:07:39.991 ###### Recommended dictionary. ###### 00:07:39.991 "\322'\010\002\000\000\000\000" # Uses: 3 00:07:39.991 "\001\000\000\004" # Uses: 0 00:07:39.991 "\001\000\000\000" # Uses: 0 00:07:39.991 ###### End of recommended dictionary. 
######
00:07:39.991 Done 62 runs in 2 second(s)
00:07:39.991 16:43:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf
00:07:39.991 16:43:25 -- ../common.sh@72 -- # (( i++ ))
00:07:39.991 16:43:25 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:39.991 16:43:25 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1
00:07:39.991 16:43:25 -- nvmf/run.sh@23 -- # local fuzzer_type=3
00:07:39.991 16:43:25 -- nvmf/run.sh@24 -- # local timen=1
00:07:39.991 16:43:25 -- nvmf/run.sh@25 -- # local core=0x1
00:07:39.991 16:43:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:39.991 16:43:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf
00:07:39.991 16:43:25 -- nvmf/run.sh@29 -- # printf %02d 3
00:07:39.991 16:43:25 -- nvmf/run.sh@29 -- # port=4403
00:07:39.991 16:43:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:39.991 16:43:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403'
00:07:39.991 16:43:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:39.991 16:43:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock
00:07:39.991 [2024-11-16 16:43:25.737321] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:07:39.991 [2024-11-16 16:43:25.737416] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid484386 ]
00:07:40.251 EAL: No free 2048 kB hugepages reported on node 1
00:07:40.251 [2024-11-16 16:43:25.910560] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:40.251 [2024-11-16 16:43:25.930014] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:40.251 [2024-11-16 16:43:25.930144] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:40.251 [2024-11-16 16:43:25.981423] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:40.251 [2024-11-16 16:43:25.997778] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 ***
00:07:40.511 INFO: Running with entropic power schedule (0xFF, 100).
00:07:40.511 INFO: Seed: 1098343507
00:07:40.511 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:07:40.511 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:07:40.511 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3
00:07:40.511 INFO: A corpus is not provided, starting from an empty corpus
00:07:40.511 #2 INITED exec/s: 0 rss: 59Mb
00:07:40.511 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
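A note for the IDENTIFY (06) traffic in the run that just completed: the ctrlr.c:2598 error "Identify Namespace for invalid NSID 0" and the INVALID NAMESPACE OR FORMAT (00/0b) completions correspond to the CNS=0 case of Identify, which requires a valid namespace ID. Below is a compact sketch of that decode, assuming the base-spec layout (CNS in CDW10 bits 7:0); the constants and names are illustrative, not SPDK's actual code.

    #include <stdint.h>
    #include <stdio.h>

    /* Generic-status codes matching the completions printed above. */
    #define SC_INVALID_FIELD        0x02  /* printed as (00/02) */
    #define SC_INVALID_NS_OR_FORMAT 0x0b  /* printed as (00/0b) */

    #define CNS_IDENTIFY_NS   0x00  /* needs a valid NSID */
    #define CNS_IDENTIFY_CTRL 0x01  /* NSID ignored */

    /* Illustrative admin-path decode for IDENTIFY (opcode 0x06). */
    static int identify_status(uint32_t cdw10, uint32_t nsid,
                               uint32_t active_ns_count)
    {
        uint8_t cns = cdw10 & 0xff;

        switch (cns) {
        case CNS_IDENTIFY_NS:
            /* nsid:0 cdw10:00000000, as fuzzed repeatedly above,
             * lands here and is rejected. */
            if (nsid == 0 || nsid > active_ns_count) {
                return SC_INVALID_NS_OR_FORMAT;
            }
            return 0;
        case CNS_IDENTIFY_CTRL:
            return 0;
        default:
            /* Unsupported CNS values come back as INVALID FIELD. */
            return SC_INVALID_FIELD;
        }
    }

    int main(void)
    {
        /* The two command shapes seen in the log (32 namespaces assumed). */
        printf("cns=0x00 nsid=0 -> 0x%02x\n", identify_status(0x00000000, 0, 32));
        printf("cns=0x0a nsid=0 -> 0x%02x\n", identify_status(0x0000000a, 0, 32));
        return 0;
    }

This is also why commands that differ only in the low byte of CDW10 land in different completion paths above: cdw10:00000000 with nsid:0 draws (00/0b), while cdw10:0000000a draws (00/02).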
00:07:40.511 This may also happen if the target rejected all inputs we tried so far 00:07:40.771 NEW_FUNC[1/659]: 0x456418 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:40.771 NEW_FUNC[2/659]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.771 #8 NEW cov: 11494 ft: 11491 corp: 2/9b lim: 20 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:07:40.771 #9 NEW cov: 11607 ft: 12081 corp: 3/17b lim: 20 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 ShuffleBytes- 00:07:40.771 #10 NEW cov: 11613 ft: 12665 corp: 4/21b lim: 20 exec/s: 0 rss: 67Mb L: 4/8 MS: 1 InsertRepeatedBytes- 00:07:41.030 #11 NEW cov: 11715 ft: 13369 corp: 5/37b lim: 20 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 CopyPart- 00:07:41.030 #12 NEW cov: 11719 ft: 13561 corp: 6/51b lim: 20 exec/s: 0 rss: 67Mb L: 14/16 MS: 1 CopyPart- 00:07:41.030 #13 NEW cov: 11719 ft: 13670 corp: 7/55b lim: 20 exec/s: 0 rss: 67Mb L: 4/16 MS: 1 ChangeByte- 00:07:41.030 [2024-11-16 16:43:26.666429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.030 [2024-11-16 16:43:26.666476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.030 NEW_FUNC[1/20]: 0x1137598 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:07:41.030 NEW_FUNC[2/20]: 0x1138118 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:07:41.030 #20 NEW cov: 12044 ft: 14102 corp: 8/73b lim: 20 exec/s: 0 rss: 67Mb L: 18/18 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:07:41.030 #21 NEW cov: 12044 ft: 14150 corp: 9/81b lim: 20 exec/s: 0 rss: 67Mb L: 8/18 MS: 1 CrossOver- 00:07:41.345 #22 NEW cov: 12044 ft: 14215 corp: 10/88b lim: 20 exec/s: 0 rss: 67Mb L: 7/18 MS: 1 CrossOver- 00:07:41.345 #23 NEW cov: 12044 ft: 14245 corp: 11/96b lim: 20 exec/s: 0 rss: 67Mb L: 8/18 MS: 1 ChangeBinInt- 00:07:41.345 #24 NEW cov: 12044 ft: 14264 corp: 12/104b lim: 20 exec/s: 0 rss: 68Mb L: 8/18 MS: 1 ChangeBit- 00:07:41.345 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.345 #25 NEW cov: 12067 ft: 14315 corp: 13/110b lim: 20 exec/s: 0 rss: 68Mb L: 6/18 MS: 1 EraseBytes- 00:07:41.345 #26 NEW cov: 12067 ft: 14348 corp: 14/117b lim: 20 exec/s: 0 rss: 68Mb L: 7/18 MS: 1 ChangeByte- 00:07:41.345 [2024-11-16 16:43:27.027438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.345 [2024-11-16 16:43:27.027475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.345 #27 NEW cov: 12067 ft: 14410 corp: 15/130b lim: 20 exec/s: 27 rss: 68Mb L: 13/18 MS: 1 InsertRepeatedBytes- 00:07:41.697 #28 NEW cov: 12067 ft: 14438 corp: 16/137b lim: 20 exec/s: 28 rss: 68Mb L: 7/18 MS: 1 ChangeByte- 00:07:41.697 #29 NEW cov: 12067 ft: 14443 corp: 17/141b lim: 20 exec/s: 29 rss: 68Mb L: 4/18 MS: 1 CrossOver- 00:07:41.697 [2024-11-16 16:43:27.188185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.697 [2024-11-16 16:43:27.188217] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.697 #30 NEW cov: 12067 ft: 14593 corp: 18/160b lim: 20 exec/s: 30 rss: 68Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:41.697 [2024-11-16 16:43:27.238543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.697 [2024-11-16 16:43:27.238575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.697 #31 NEW cov: 12067 ft: 14689 corp: 19/180b lim: 20 exec/s: 31 rss: 68Mb L: 20/20 MS: 1 CrossOver- 00:07:41.697 #32 NEW cov: 12067 ft: 14715 corp: 20/194b lim: 20 exec/s: 32 rss: 68Mb L: 14/20 MS: 1 InsertByte- 00:07:41.697 #33 NEW cov: 12067 ft: 14748 corp: 21/202b lim: 20 exec/s: 33 rss: 68Mb L: 8/20 MS: 1 ChangeBinInt- 00:07:41.697 #34 NEW cov: 12067 ft: 14779 corp: 22/210b lim: 20 exec/s: 34 rss: 68Mb L: 8/20 MS: 1 EraseBytes- 00:07:42.060 [2024-11-16 16:43:27.449301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.060 [2024-11-16 16:43:27.449333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.060 #35 NEW cov: 12067 ft: 14809 corp: 23/230b lim: 20 exec/s: 35 rss: 68Mb L: 20/20 MS: 1 ChangeBit- 00:07:42.060 #36 NEW cov: 12067 ft: 14824 corp: 24/246b lim: 20 exec/s: 36 rss: 68Mb L: 16/20 MS: 1 CopyPart- 00:07:42.060 #37 NEW cov: 12067 ft: 14841 corp: 25/252b lim: 20 exec/s: 37 rss: 68Mb L: 6/20 MS: 1 ChangeByte- 00:07:42.060 #38 NEW cov: 12067 ft: 14853 corp: 26/259b lim: 20 exec/s: 38 rss: 69Mb L: 7/20 MS: 1 ChangeByte- 00:07:42.060 [2024-11-16 16:43:27.690046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.060 [2024-11-16 16:43:27.690079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.060 #39 NEW cov: 12067 ft: 14873 corp: 27/279b lim: 20 exec/s: 39 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:42.060 #40 NEW cov: 12067 ft: 14933 corp: 28/283b lim: 20 exec/s: 40 rss: 69Mb L: 4/20 MS: 1 ChangeByte- 00:07:42.322 #41 NEW cov: 12067 ft: 14934 corp: 29/287b lim: 20 exec/s: 41 rss: 69Mb L: 4/20 MS: 1 CopyPart- 00:07:42.322 #42 NEW cov: 12070 ft: 14975 corp: 30/306b lim: 20 exec/s: 42 rss: 69Mb L: 19/20 MS: 1 CopyPart- 00:07:42.322 #43 NEW cov: 12070 ft: 14978 corp: 31/324b lim: 20 exec/s: 43 rss: 69Mb L: 18/20 MS: 1 InsertRepeatedBytes- 00:07:42.322 [2024-11-16 16:43:27.960383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.322 [2024-11-16 16:43:27.960422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.322 #44 NEW cov: 12070 ft: 14980 corp: 32/337b lim: 20 exec/s: 44 rss: 69Mb L: 13/20 MS: 1 ChangeBinInt- 00:07:42.322 #45 NEW cov: 12070 ft: 14987 corp: 33/346b lim: 20 exec/s: 45 rss: 69Mb L: 9/20 MS: 1 InsertByte- 00:07:42.593 #46 NEW cov: 12070 ft: 15003 corp: 34/354b lim: 20 exec/s: 23 rss: 69Mb L: 8/20 MS: 1 ChangeBit- 00:07:42.593 #46 DONE cov: 12070 ft: 15003 corp: 34/354b lim: 20 exec/s: 23 rss: 69Mb 00:07:42.593 Done 46 runs in 2 second(s) 00:07:42.593 16:43:28 
-- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:42.593 16:43:28 -- ../common.sh@72 -- # (( i++ )) 00:07:42.593 16:43:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.593 16:43:28 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:42.593 16:43:28 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:42.593 16:43:28 -- nvmf/run.sh@24 -- # local timen=1 00:07:42.593 16:43:28 -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.593 16:43:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:42.593 16:43:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:42.593 16:43:28 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:42.593 16:43:28 -- nvmf/run.sh@29 -- # port=4404 00:07:42.593 16:43:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:42.593 16:43:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:42.593 16:43:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.593 16:43:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:42.593 [2024-11-16 16:43:28.235615] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:42.593 [2024-11-16 16:43:28.235715] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid484938 ] 00:07:42.593 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.874 [2024-11-16 16:43:28.412369] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.874 [2024-11-16 16:43:28.432030] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:42.874 [2024-11-16 16:43:28.432141] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.874 [2024-11-16 16:43:28.483376] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:42.874 [2024-11-16 16:43:28.499681] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:42.874 INFO: Running with entropic power schedule (0xFF, 100). 00:07:42.874 INFO: Seed: 3600302499 00:07:42.874 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:42.874 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:42.874 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:42.874 INFO: A corpus is not provided, starting from an empty corpus 00:07:42.874 #2 INITED exec/s: 0 rss: 59Mb 00:07:42.874 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:42.874 This may also happen if the target rejected all inputs we tried so far 00:07:42.874 [2024-11-16 16:43:28.544985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.874 [2024-11-16 16:43:28.545013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.874 [2024-11-16 16:43:28.545065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.874 [2024-11-16 16:43:28.545078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.218 NEW_FUNC[1/671]: 0x457518 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:43.218 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:43.218 #5 NEW cov: 11605 ft: 11606 corp: 2/16b lim: 35 exec/s: 0 rss: 67Mb L: 15/15 MS: 3 ChangeBit-CrossOver-InsertRepeatedBytes- 00:07:43.218 [2024-11-16 16:43:28.855792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.218 [2024-11-16 16:43:28.855823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.218 [2024-11-16 16:43:28.855878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.218 [2024-11-16 16:43:28.855892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.218 #6 NEW cov: 11718 ft: 12022 corp: 3/31b lim: 35 exec/s: 0 rss: 67Mb L: 15/15 MS: 1 ChangeBit- 00:07:43.218 [2024-11-16 16:43:28.895934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a00 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.218 [2024-11-16 16:43:28.895961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.218 [2024-11-16 16:43:28.896015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.218 [2024-11-16 16:43:28.896028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.218 [2024-11-16 16:43:28.896082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.218 [2024-11-16 16:43:28.896096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.218 #7 NEW cov: 11724 ft: 12620 corp: 4/57b lim: 35 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:43.218 [2024-11-16 16:43:28.936077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a00 cdw11:36360000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.218 [2024-11-16 16:43:28.936105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.218 [2024-11-16 16:43:28.936145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.218 [2024-11-16 16:43:28.936159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.218 [2024-11-16 16:43:28.936212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:40000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.218 [2024-11-16 16:43:28.936227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.477 #8 NEW cov: 11809 ft: 12823 corp: 5/83b lim: 35 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 ChangeByte- 00:07:43.477 [2024-11-16 16:43:28.976029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.477 [2024-11-16 16:43:28.976055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.477 [2024-11-16 16:43:28.976107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.477 [2024-11-16 16:43:28.976120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.477 #9 NEW cov: 11809 ft: 12957 corp: 6/98b lim: 35 exec/s: 0 rss: 67Mb L: 15/26 MS: 1 ChangeBinInt- 00:07:43.477 [2024-11-16 16:43:29.016115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.477 [2024-11-16 16:43:29.016141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.016193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.016210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.478 #10 NEW cov: 11809 ft: 13060 corp: 7/113b lim: 35 exec/s: 0 rss: 67Mb L: 15/26 MS: 1 CopyPart- 00:07:43.478 [2024-11-16 16:43:29.056391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.056418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.056472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00bd0000 cdw11:bdbd0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.056487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.056540] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:bdbdbdbd cdw11:bdbd0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.056553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.478 #11 NEW cov: 11809 ft: 13163 corp: 8/138b lim: 35 exec/s: 0 rss: 67Mb L: 25/26 MS: 1 InsertRepeatedBytes- 00:07:43.478 [2024-11-16 16:43:29.096542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a60 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.096569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.096622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.096637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.096689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.096704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.478 #12 NEW cov: 11809 ft: 13221 corp: 9/164b lim: 35 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 ChangeByte- 00:07:43.478 [2024-11-16 16:43:29.136755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.136782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.136833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.136847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.136897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.136911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.136961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.136974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.478 #13 NEW cov: 11809 ft: 13561 corp: 10/194b lim: 35 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 CopyPart- 00:07:43.478 [2024-11-16 16:43:29.176729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.176760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.176813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:007b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.176827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.176880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.176894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.478 #14 NEW cov: 11809 ft: 13590 corp: 11/221b lim: 35 exec/s: 0 rss: 67Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:07:43.478 [2024-11-16 16:43:29.216680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.216706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.478 [2024-11-16 16:43:29.216758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.478 [2024-11-16 16:43:29.216771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.737 #15 NEW cov: 11809 ft: 13647 corp: 12/239b lim: 35 exec/s: 0 rss: 67Mb L: 18/30 MS: 1 InsertRepeatedBytes- 00:07:43.737 [2024-11-16 16:43:29.257224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0a00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.257250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.257303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.257317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.257367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.257381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.257429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.257442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.257492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.257506] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.737 #16 NEW cov: 11809 ft: 13714 corp: 13/274b lim: 35 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:43.737 [2024-11-16 16:43:29.297031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.297057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.297111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:007b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.297128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.297178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7b7b7b79 cdw11:7b7b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.297193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.737 #17 NEW cov: 11809 ft: 13739 corp: 14/301b lim: 35 exec/s: 0 rss: 67Mb L: 27/35 MS: 1 ChangeBit- 00:07:43.737 [2024-11-16 16:43:29.337061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.337087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.337140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.337153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.737 #18 NEW cov: 11809 ft: 13856 corp: 15/316b lim: 35 exec/s: 0 rss: 67Mb L: 15/35 MS: 1 ChangeByte- 00:07:43.737 [2024-11-16 16:43:29.377300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a60 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.377326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.377381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.377395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.377448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.377462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.737 #19 NEW cov: 11809 ft: 13881 corp: 16/343b lim: 35 exec/s: 0 rss: 68Mb L: 27/35 MS: 1 CopyPart- 00:07:43.737 [2024-11-16 16:43:29.417551] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.737 [2024-11-16 16:43:29.417576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.737 [2024-11-16 16:43:29.417627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00bd0000 cdw11:0a600000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.738 [2024-11-16 16:43:29.417640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.738 [2024-11-16 16:43:29.417695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.738 [2024-11-16 16:43:29.417709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.738 [2024-11-16 16:43:29.417774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:bdbd36bd cdw11:bdbd0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.738 [2024-11-16 16:43:29.417788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.738 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.738 #20 NEW cov: 11832 ft: 13893 corp: 17/374b lim: 35 exec/s: 0 rss: 68Mb L: 31/35 MS: 1 CrossOver- 00:07:43.738 [2024-11-16 16:43:29.467693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a00 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.738 [2024-11-16 16:43:29.467719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.738 [2024-11-16 16:43:29.467773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.738 [2024-11-16 16:43:29.467787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.738 [2024-11-16 16:43:29.467838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:400a0000 cdw11:60000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.738 [2024-11-16 16:43:29.467851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.738 [2024-11-16 16:43:29.467904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.738 [2024-11-16 16:43:29.467916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.997 #21 NEW cov: 11832 ft: 13903 corp: 18/407b lim: 35 exec/s: 0 rss: 68Mb L: 33/35 MS: 1 CrossOver- 00:07:43.997 [2024-11-16 16:43:29.507813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a00 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.507839] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.997 [2024-11-16 16:43:29.507890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.507904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.997 [2024-11-16 16:43:29.507956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00403600 cdw11:0a600000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.507969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.997 [2024-11-16 16:43:29.508019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.508031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.997 #22 NEW cov: 11832 ft: 13921 corp: 19/441b lim: 35 exec/s: 22 rss: 68Mb L: 34/35 MS: 1 CopyPart- 00:07:43.997 [2024-11-16 16:43:29.547532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.547558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.997 #23 NEW cov: 11832 ft: 14726 corp: 20/450b lim: 35 exec/s: 23 rss: 68Mb L: 9/35 MS: 1 EraseBytes- 00:07:43.997 [2024-11-16 16:43:29.588097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0a00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.588123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.997 [2024-11-16 16:43:29.588174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.588188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.997 [2024-11-16 16:43:29.588243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.588256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.997 [2024-11-16 16:43:29.588307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.588320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.997 #24 NEW cov: 11832 ft: 14791 corp: 21/481b lim: 35 exec/s: 24 rss: 68Mb L: 31/35 MS: 1 EraseBytes- 00:07:43.997 [2024-11-16 16:43:29.628049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00400000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.628074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.997 [2024-11-16 16:43:29.628127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000dd00 cdw11:007b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.628141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.997 [2024-11-16 16:43:29.628193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.997 [2024-11-16 16:43:29.628207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.998 #25 NEW cov: 11832 ft: 14820 corp: 22/508b lim: 35 exec/s: 25 rss: 68Mb L: 27/35 MS: 1 ChangeByte- 00:07:43.998 [2024-11-16 16:43:29.668047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-11-16 16:43:29.668073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.998 [2024-11-16 16:43:29.668123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36360036 cdw11:36000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-11-16 16:43:29.668137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.998 #26 NEW cov: 11832 ft: 14830 corp: 23/528b lim: 35 exec/s: 26 rss: 68Mb L: 20/35 MS: 1 CrossOver- 00:07:43.998 [2024-11-16 16:43:29.708454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-11-16 16:43:29.708479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.998 [2024-11-16 16:43:29.708531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-11-16 16:43:29.708545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.998 [2024-11-16 16:43:29.708596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2a000000 cdw11:40000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-11-16 16:43:29.708610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.998 [2024-11-16 16:43:29.708660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.998 [2024-11-16 16:43:29.708678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.998 #27 NEW cov: 11832 ft: 14835 corp: 24/558b lim: 35 exec/s: 27 rss: 68Mb L: 30/35 MS: 1 ChangeBit- 00:07:44.257 [2024-11-16 16:43:29.748613] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.748639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.748696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00bd0000 cdw11:0a600000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.748710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.748763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.748778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.748831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:bdbd36bd cdw11:bdbd0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.748844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.257 #28 NEW cov: 11832 ft: 14854 corp: 25/589b lim: 35 exec/s: 28 rss: 68Mb L: 31/35 MS: 1 ChangeByte- 00:07:44.257 [2024-11-16 16:43:29.788659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.788688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.788744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:40000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.788757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.788809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7b7b0000 cdw11:7b7b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.788823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.788874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:7b7b7b7b cdw11:7b7b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.788887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.257 #29 NEW cov: 11832 ft: 14865 corp: 26/620b lim: 35 exec/s: 29 rss: 68Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:07:44.257 [2024-11-16 16:43:29.828572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.828597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.828650] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:007b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.828663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.828721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7b7b7b79 cdw11:7b7b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.828735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.257 #30 NEW cov: 11832 ft: 14925 corp: 27/647b lim: 35 exec/s: 30 rss: 68Mb L: 27/35 MS: 1 ChangeByte- 00:07:44.257 [2024-11-16 16:43:29.868577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.868603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.868654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.868667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.257 #31 NEW cov: 11832 ft: 14929 corp: 28/662b lim: 35 exec/s: 31 rss: 69Mb L: 15/35 MS: 1 ChangeBit- 00:07:44.257 [2024-11-16 16:43:29.908984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00300a00 cdw11:31360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.909009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.909062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:39303834 cdw11:30370000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.909075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.909125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:400a0000 cdw11:60000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.257 [2024-11-16 16:43:29.909138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.257 [2024-11-16 16:43:29.909186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.258 [2024-11-16 16:43:29.909199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.258 #32 NEW cov: 11832 ft: 14951 corp: 29/695b lim: 35 exec/s: 32 rss: 69Mb L: 33/35 MS: 1 ChangeASCIIInt- 00:07:44.258 [2024-11-16 16:43:29.948979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.258 [2024-11-16 16:43:29.949004] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.258 [2024-11-16 16:43:29.949056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.258 [2024-11-16 16:43:29.949070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.258 [2024-11-16 16:43:29.949121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.258 [2024-11-16 16:43:29.949135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.258 #33 NEW cov: 11832 ft: 14981 corp: 30/716b lim: 35 exec/s: 33 rss: 69Mb L: 21/35 MS: 1 InsertRepeatedBytes- 00:07:44.258 [2024-11-16 16:43:29.989225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.258 [2024-11-16 16:43:29.989250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.258 [2024-11-16 16:43:29.989302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.258 [2024-11-16 16:43:29.989318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.258 [2024-11-16 16:43:29.989369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.258 [2024-11-16 16:43:29.989383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.258 [2024-11-16 16:43:29.989436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.258 [2024-11-16 16:43:29.989448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.517 #34 NEW cov: 11832 ft: 14998 corp: 31/745b lim: 35 exec/s: 34 rss: 69Mb L: 29/35 MS: 1 InsertRepeatedBytes- 00:07:44.517 [2024-11-16 16:43:30.029041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0a00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.517 [2024-11-16 16:43:30.029068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.517 [2024-11-16 16:43:30.029122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.517 [2024-11-16 16:43:30.029136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.517 #35 NEW cov: 11832 ft: 15024 corp: 32/765b lim: 35 exec/s: 35 rss: 69Mb L: 20/35 MS: 1 EraseBytes- 00:07:44.517 [2024-11-16 16:43:30.069213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 
cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.517 [2024-11-16 16:43:30.069241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.517 [2024-11-16 16:43:30.069295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.517 [2024-11-16 16:43:30.069309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.517 #36 NEW cov: 11832 ft: 15027 corp: 33/785b lim: 35 exec/s: 36 rss: 69Mb L: 20/35 MS: 1 InsertRepeatedBytes- 00:07:44.517 [2024-11-16 16:43:30.109619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.517 [2024-11-16 16:43:30.109646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.517 [2024-11-16 16:43:30.109699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:007b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.517 [2024-11-16 16:43:30.109713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.517 [2024-11-16 16:43:30.109767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:7b7b7b7b cdw11:7b7b0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.518 [2024-11-16 16:43:30.109781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.518 [2024-11-16 16:43:30.109831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:7b7b267b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.518 [2024-11-16 16:43:30.109844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.518 #37 NEW cov: 11832 ft: 15038 corp: 34/813b lim: 35 exec/s: 37 rss: 69Mb L: 28/35 MS: 1 InsertByte- 00:07:44.518 [2024-11-16 16:43:30.149412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:10ff0a00 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.518 [2024-11-16 16:43:30.149437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.518 [2024-11-16 16:43:30.149491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.518 [2024-11-16 16:43:30.149506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.518 #38 NEW cov: 11832 ft: 15068 corp: 35/833b lim: 35 exec/s: 38 rss: 69Mb L: 20/35 MS: 1 ChangeBit- 00:07:44.518 [2024-11-16 16:43:30.189742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.518 [2024-11-16 16:43:30.189767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.518 [2024-11-16 
16:43:30.189821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00bd0000 cdw11:bdbd0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.518 [2024-11-16 16:43:30.189835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.518 [2024-11-16 16:43:30.189888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:bdbdbdbd cdw11:bdb80000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.518 [2024-11-16 16:43:30.189902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.518 #39 NEW cov: 11832 ft: 15088 corp: 36/858b lim: 35 exec/s: 39 rss: 69Mb L: 25/35 MS: 1 ChangeBinInt- 00:07:44.518 [2024-11-16 16:43:30.229659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.518 [2024-11-16 16:43:30.229688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.518 [2024-11-16 16:43:30.229741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.518 [2024-11-16 16:43:30.229755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.518 #40 NEW cov: 11832 ft: 15115 corp: 37/873b lim: 35 exec/s: 40 rss: 69Mb L: 15/35 MS: 1 CopyPart- 00:07:44.778 [2024-11-16 16:43:30.270080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a00 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.270106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.270160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.270174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.270228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:400a0000 cdw11:60000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.270243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.270293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.270307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.310221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a00 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.310247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 
16:43:30.310299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:40360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.310312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.310366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:360a3600 cdw11:60000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.310379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.310428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.310441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.778 #42 NEW cov: 11832 ft: 15125 corp: 38/906b lim: 35 exec/s: 42 rss: 69Mb L: 33/35 MS: 2 ChangeByte-ShuffleBytes- 00:07:44.778 [2024-11-16 16:43:30.350321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00300a00 cdw11:31360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.350347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.350401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:39303834 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.350415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.350465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:36363636 cdw11:36000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.350479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.350531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36360036 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.350544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.778 #43 NEW cov: 11832 ft: 15136 corp: 39/939b lim: 35 exec/s: 43 rss: 69Mb L: 33/35 MS: 1 CopyPart- 00:07:44.778 [2024-11-16 16:43:30.390449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a00 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.390475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.390528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.390542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.778 
[2024-11-16 16:43:30.390594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:36000000 cdw11:36400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.390608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.390660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36366036 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.390685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.778 #44 NEW cov: 11832 ft: 15142 corp: 40/972b lim: 35 exec/s: 44 rss: 69Mb L: 33/35 MS: 1 ShuffleBytes- 00:07:44.778 [2024-11-16 16:43:30.430567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00360a00 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.430592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.430645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.430659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.430715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.430729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.430781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00004000 cdw11:c7000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.430794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.778 #45 NEW cov: 11832 ft: 15180 corp: 41/1003b lim: 35 exec/s: 45 rss: 69Mb L: 31/35 MS: 1 CrossOver- 00:07:44.778 [2024-11-16 16:43:30.480419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00400000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.480446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.480499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.480513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.778 #46 NEW cov: 11832 ft: 15197 corp: 42/1018b lim: 35 exec/s: 46 rss: 69Mb L: 15/35 MS: 1 ShuffleBytes- 00:07:44.778 [2024-11-16 16:43:30.520672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00300a00 cdw11:31360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.520699] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.520752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:39303834 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.520766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.778 [2024-11-16 16:43:30.520821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:36363636 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.778 [2024-11-16 16:43:30.520835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.037 #47 NEW cov: 11832 ft: 15206 corp: 43/1043b lim: 35 exec/s: 23 rss: 69Mb L: 25/35 MS: 1 EraseBytes- 00:07:45.037 #47 DONE cov: 11832 ft: 15206 corp: 43/1043b lim: 35 exec/s: 23 rss: 69Mb 00:07:45.037 Done 47 runs in 2 second(s) 00:07:45.037 16:43:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:45.037 16:43:30 -- ../common.sh@72 -- # (( i++ )) 00:07:45.037 16:43:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.037 16:43:30 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:45.037 16:43:30 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:45.037 16:43:30 -- nvmf/run.sh@24 -- # local timen=1 00:07:45.038 16:43:30 -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.038 16:43:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.038 16:43:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:45.038 16:43:30 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:45.038 16:43:30 -- nvmf/run.sh@29 -- # port=4405 00:07:45.038 16:43:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.038 16:43:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:45.038 16:43:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.038 16:43:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:45.038 [2024-11-16 16:43:30.697820] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
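The nvmf/run.sh trace above shows how each fuzzing round is isolated: the round index picks a dedicated NVMe/TCP service ID (44 followed by the zero-padded index, so round 5 listens on 4405), a fresh corpus directory is created, and the shared fuzz_json.conf template has its default trsvcid of 4420 rewritten before llvm_nvme_fuzz is launched against it. Below is a minimal sketch of that per-round setup, reconstructed from the trace under the assumption of the same tree layout; SPDK_DIR stands in for the checkout path, and the real script additionally wires up the corpus directory, timeout, and per-round RPC socket on the fuzzer command line.

  fuzzer_type=5                                    # round index, as in "start_llvm_fuzz 5 1 0x1"
  port="44$(printf %02d "$fuzzer_type")"           # 4405: "44" plus the zero-padded round index
  nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
  corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}"
  mkdir -p "$corpus_dir"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Rewrite the template's default service ID for this round. The xtrace above
  # elides the redirection, but the fuzzer is later pointed at $nvmf_cfg via -c,
  # so presumably the sed output lands there:
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"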
00:07:45.038 [2024-11-16 16:43:30.697888] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485278 ] 00:07:45.038 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.297 [2024-11-16 16:43:30.883522] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.297 [2024-11-16 16:43:30.902930] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:45.297 [2024-11-16 16:43:30.903047] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.297 [2024-11-16 16:43:30.954675] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.297 [2024-11-16 16:43:30.971003] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:45.297 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.297 INFO: Seed: 1776394132 00:07:45.297 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:45.297 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:45.297 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.297 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.297 #2 INITED exec/s: 0 rss: 60Mb 00:07:45.297 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:45.297 This may also happen if the target rejected all inputs we tried so far 00:07:45.297 [2024-11-16 16:43:31.016564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.297 [2024-11-16 16:43:31.016594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.297 [2024-11-16 16:43:31.016652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.297 [2024-11-16 16:43:31.016665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.297 [2024-11-16 16:43:31.016726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.297 [2024-11-16 16:43:31.016740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.557 NEW_FUNC[1/671]: 0x4596b8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:45.557 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:45.557 #9 NEW cov: 11616 ft: 11617 corp: 2/29b lim: 45 exec/s: 0 rss: 67Mb L: 28/28 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:45.817 [2024-11-16 16:43:31.327932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.327971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
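Each *NOTICE* pair above is one fuzz iteration seen from both sides: nvme_admin_qpair_print_command dumps the admin command as submitted (opcode 0x01 = CREATE IO SQ, with the mutated cdw10/cdw11 dwords), and spdk_nvme_print_completion dumps the target's verdict. For reference, a hedged sketch of how those dwords are packed in a well-formed CREATE IO SQ per the NVMe spec; the struct is a minimal stand-in, not SPDK's full spdk_nvme_cmd:

```c
#include <stdint.h>
#include <stdio.h>

/* Minimal stand-in for the admin command fields printed in the log. */
struct nvme_cmd {
    uint8_t  opc;          /* 0x01 = CREATE IO SQ, 0x05 = CREATE IO CQ */
    uint16_t cid;          /* command id, echoed back in the completion */
    uint32_t nsid;
    uint32_t cdw10, cdw11;
};

int main(void)
{
    struct nvme_cmd cmd = { .opc = 0x01, .cid = 4, .nsid = 0 };
    uint16_t qid = 1, qsize = 128, cqid = 1;

    cmd.cdw10 = ((uint32_t)(qsize - 1) << 16) | qid;  /* 0-based size | SQ id */
    cmd.cdw11 = ((uint32_t)cqid << 16) | 0x1;         /* paired CQ id | PC bit */

    printf("CREATE IO SQ (%02x) cdw10:%08x cdw11:%08x\n",
           cmd.opc, cmd.cdw10, cmd.cdw11);
    return 0;
}
```

Inputs like cdw10:ffffffff cdw11:ffff0007 in the run are the fuzzer deliberately breaking this packing; the interesting property is that the target answers every one with a clean error completion instead of crashing.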
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.817 #10 NEW cov: 11729 ft: 13158 corp: 3/39b lim: 45 exec/s: 0 rss: 67Mb L: 10/28 MS: 1 CrossOver- 00:07:45.817 [2024-11-16 16:43:31.378507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.378537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.378659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.378681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.378800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.378818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.817 #11 NEW cov: 11735 ft: 13367 corp: 4/67b lim: 45 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 CopyPart- 00:07:45.817 [2024-11-16 16:43:31.428699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.428727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.428854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.428872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.428981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.428996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.817 #12 NEW cov: 11820 ft: 13529 corp: 5/95b lim: 45 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 ChangeBinInt- 00:07:45.817 [2024-11-16 16:43:31.468708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.468734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.468841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:faffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.468858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.468971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 
[2024-11-16 16:43:31.468988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.817 #13 NEW cov: 11820 ft: 13625 corp: 6/123b lim: 45 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 ChangeBinInt- 00:07:45.817 [2024-11-16 16:43:31.508862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.508892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.509006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.509021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.509138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.509154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.817 #14 NEW cov: 11820 ft: 13737 corp: 7/151b lim: 45 exec/s: 0 rss: 67Mb L: 28/28 MS: 1 CopyPart- 00:07:45.817 [2024-11-16 16:43:31.549209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.549234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.549339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.549355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.549471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff7affff cdw11:d0890007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.549485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.817 [2024-11-16 16:43:31.549592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff8a00 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.817 [2024-11-16 16:43:31.549609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.077 #15 NEW cov: 11820 ft: 14177 corp: 8/187b lim: 45 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 CMP- DE: "z\320\211\344\031\027\212\000"- 00:07:46.077 [2024-11-16 16:43:31.599377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.599403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.599534] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.599551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.599660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff7affff cdw11:d0890000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.599679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.599801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff19e4 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.599816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.077 #16 NEW cov: 11820 ft: 14213 corp: 9/223b lim: 45 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 ShuffleBytes- 00:07:46.077 [2024-11-16 16:43:31.649022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.649051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.649165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.649179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.077 #17 NEW cov: 11820 ft: 14503 corp: 10/243b lim: 45 exec/s: 0 rss: 67Mb L: 20/36 MS: 1 EraseBytes- 00:07:46.077 [2024-11-16 16:43:31.689444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.689469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.689585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.689602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.689729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.689742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.077 #18 NEW cov: 11820 ft: 14554 corp: 11/272b lim: 45 exec/s: 0 rss: 67Mb L: 29/36 MS: 1 CopyPart- 00:07:46.077 [2024-11-16 16:43:31.729578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:89e47ad0 cdw11:19170004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.729604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.729739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.729755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.729875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.729891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.077 #19 NEW cov: 11820 ft: 14593 corp: 12/300b lim: 45 exec/s: 0 rss: 67Mb L: 28/36 MS: 1 PersAutoDict- DE: "z\320\211\344\031\027\212\000"- 00:07:46.077 [2024-11-16 16:43:31.769986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.770012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.770141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.770160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.770238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.770254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.770378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.770397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.077 #20 NEW cov: 11820 ft: 14603 corp: 13/336b lim: 45 exec/s: 0 rss: 67Mb L: 36/36 MS: 1 CMP- DE: "\017\000\000\000\000\000\000\000"- 00:07:46.077 [2024-11-16 16:43:31.809927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.809954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.810066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.810083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.810192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.810209] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.077 [2024-11-16 16:43:31.810320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.077 [2024-11-16 16:43:31.810336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.337 #21 NEW cov: 11820 ft: 14632 corp: 14/379b lim: 45 exec/s: 0 rss: 67Mb L: 43/43 MS: 1 CrossOver- 00:07:46.337 [2024-11-16 16:43:31.859840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.337 [2024-11-16 16:43:31.859867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.337 [2024-11-16 16:43:31.860013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.337 [2024-11-16 16:43:31.860031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.337 [2024-11-16 16:43:31.860155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.337 [2024-11-16 16:43:31.860171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.337 #22 NEW cov: 11820 ft: 14693 corp: 15/407b lim: 45 exec/s: 0 rss: 68Mb L: 28/43 MS: 1 CopyPart- 00:07:46.337 [2024-11-16 16:43:31.909581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.337 [2024-11-16 16:43:31.909607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.337 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.337 #23 NEW cov: 11843 ft: 14732 corp: 16/418b lim: 45 exec/s: 0 rss: 68Mb L: 11/43 MS: 1 InsertByte- 00:07:46.337 [2024-11-16 16:43:31.970289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.337 [2024-11-16 16:43:31.970315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.337 [2024-11-16 16:43:31.970465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fffaffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.337 [2024-11-16 16:43:31.970484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.337 [2024-11-16 16:43:31.970610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.338 [2024-11-16 16:43:31.970625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.338 
#24 NEW cov: 11843 ft: 14754 corp: 17/447b lim: 45 exec/s: 0 rss: 68Mb L: 29/43 MS: 1 CopyPart- 00:07:46.338 [2024-11-16 16:43:32.020449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.338 [2024-11-16 16:43:32.020476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.338 [2024-11-16 16:43:32.020602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffff5b cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.338 [2024-11-16 16:43:32.020619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.338 [2024-11-16 16:43:32.020738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.338 [2024-11-16 16:43:32.020756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.338 #25 NEW cov: 11843 ft: 14825 corp: 18/477b lim: 45 exec/s: 25 rss: 68Mb L: 30/43 MS: 1 InsertByte- 00:07:46.338 [2024-11-16 16:43:32.070892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.338 [2024-11-16 16:43:32.070919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.338 [2024-11-16 16:43:32.071041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.338 [2024-11-16 16:43:32.071056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.338 [2024-11-16 16:43:32.071170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.338 [2024-11-16 16:43:32.071187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.338 [2024-11-16 16:43:32.071303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.338 [2024-11-16 16:43:32.071319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.598 #26 NEW cov: 11843 ft: 14832 corp: 19/520b lim: 45 exec/s: 26 rss: 68Mb L: 43/43 MS: 1 CopyPart- 00:07:46.598 [2024-11-16 16:43:32.120210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.120237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.598 #27 NEW cov: 11843 ft: 14847 corp: 20/530b lim: 45 exec/s: 27 rss: 68Mb L: 10/43 MS: 1 ChangeByte- 00:07:46.598 [2024-11-16 16:43:32.160928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff 
cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.160957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.598 [2024-11-16 16:43:32.161098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fffaffff cdw11:0f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.161119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.598 [2024-11-16 16:43:32.161239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00ff0000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.161257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.598 #28 NEW cov: 11843 ft: 14861 corp: 21/559b lim: 45 exec/s: 28 rss: 68Mb L: 29/43 MS: 1 PersAutoDict- DE: "\017\000\000\000\000\000\000\000"- 00:07:46.598 [2024-11-16 16:43:32.200727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.200755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.598 [2024-11-16 16:43:32.200889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.200907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.598 #29 NEW cov: 11843 ft: 14889 corp: 22/579b lim: 45 exec/s: 29 rss: 68Mb L: 20/43 MS: 1 ShuffleBytes- 00:07:46.598 [2024-11-16 16:43:32.241388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.241414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.598 [2024-11-16 16:43:32.241523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.241538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.598 [2024-11-16 16:43:32.241648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff7affff cdw11:d0890000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.241663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.598 [2024-11-16 16:43:32.241803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff19e4 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.241816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.598 #30 NEW cov: 11843 ft: 14901 corp: 23/615b lim: 45 exec/s: 30 rss: 68Mb L: 36/43 MS: 1 CMP- DE: 
"\000\000\000\000"- 00:07:46.598 [2024-11-16 16:43:32.300786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7ad0ffff cdw11:89e40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.598 [2024-11-16 16:43:32.300813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.598 #31 NEW cov: 11843 ft: 14989 corp: 24/625b lim: 45 exec/s: 31 rss: 68Mb L: 10/43 MS: 1 PersAutoDict- DE: "z\320\211\344\031\027\212\000"- 00:07:46.858 [2024-11-16 16:43:32.351479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.351507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.351620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.351639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.351757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:fffffff8 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.351774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.858 #32 NEW cov: 11843 ft: 14995 corp: 25/653b lim: 45 exec/s: 32 rss: 68Mb L: 28/43 MS: 1 ChangeBinInt- 00:07:46.858 [2024-11-16 16:43:32.401267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7ad0ffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.401295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.401413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.401431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.858 #33 NEW cov: 11843 ft: 15074 corp: 26/679b lim: 45 exec/s: 33 rss: 69Mb L: 26/43 MS: 1 InsertRepeatedBytes- 00:07:46.858 [2024-11-16 16:43:32.461748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.461775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.461894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.461912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.462037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff60ffff cdw11:ffff0007 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.462054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.858 #34 NEW cov: 11843 ft: 15150 corp: 27/708b lim: 45 exec/s: 34 rss: 69Mb L: 29/43 MS: 1 EraseBytes- 00:07:46.858 [2024-11-16 16:43:32.501842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.501869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.501989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.502005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.502118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.502135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.502252] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:3a1ab370 cdw11:178a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.502270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.858 #35 NEW cov: 11843 ft: 15188 corp: 28/745b lim: 45 exec/s: 35 rss: 69Mb L: 37/43 MS: 1 CMP- DE: "A\263p:\032\027\212\000"- 00:07:46.858 [2024-11-16 16:43:32.541920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.541952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.542074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fffaffff cdw11:ffff0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.542090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.542211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.542227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.858 #36 NEW cov: 11843 ft: 15239 corp: 29/774b lim: 45 exec/s: 36 rss: 69Mb L: 29/43 MS: 1 ChangeBit- 00:07:46.858 [2024-11-16 16:43:32.581976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.858 [2024-11-16 16:43:32.582004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.858 [2024-11-16 16:43:32.582122] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:5bff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.859 [2024-11-16 16:43:32.582138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.859 [2024-11-16 16:43:32.582260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.859 [2024-11-16 16:43:32.582276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.119 #37 NEW cov: 11843 ft: 15245 corp: 30/807b lim: 45 exec/s: 37 rss: 69Mb L: 33/43 MS: 1 CrossOver- 00:07:47.119 [2024-11-16 16:43:32.622165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.622191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.622336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.622353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.622485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.622501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.119 #38 NEW cov: 11843 ft: 15269 corp: 31/835b lim: 45 exec/s: 38 rss: 69Mb L: 28/43 MS: 1 ShuffleBytes- 00:07:47.119 [2024-11-16 16:43:32.661892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.661917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.662050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.662067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.119 #40 NEW cov: 11843 ft: 15327 corp: 32/860b lim: 45 exec/s: 40 rss: 69Mb L: 25/43 MS: 2 EraseBytes-CrossOver- 00:07:47.119 [2024-11-16 16:43:32.702572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7ad0ffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.702597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.702732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.702750] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.702889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.702905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.703028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ff89ffff cdw11:e4190000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.703044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.119 #41 NEW cov: 11843 ft: 15335 corp: 33/896b lim: 45 exec/s: 41 rss: 69Mb L: 36/43 MS: 1 InsertRepeatedBytes- 00:07:47.119 [2024-11-16 16:43:32.752501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.752528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.752653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.752671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.752795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.752811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.119 #42 NEW cov: 11843 ft: 15342 corp: 34/924b lim: 45 exec/s: 42 rss: 69Mb L: 28/43 MS: 1 ShuffleBytes- 00:07:47.119 [2024-11-16 16:43:32.792718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.792745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.792856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fffaffff cdw11:ffff0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.792874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.792997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:bfffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.793011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.119 #43 NEW cov: 11843 ft: 15356 corp: 35/953b lim: 45 exec/s: 43 rss: 69Mb L: 29/43 MS: 1 CopyPart- 00:07:47.119 [2024-11-16 16:43:32.833062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 
cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.833087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.833197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.833213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.833324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.833340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.119 [2024-11-16 16:43:32.833449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffa60005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.119 [2024-11-16 16:43:32.833465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.119 #44 NEW cov: 11843 ft: 15390 corp: 36/994b lim: 45 exec/s: 44 rss: 69Mb L: 41/43 MS: 1 InsertRepeatedBytes- 00:07:47.379 [2024-11-16 16:43:32.872687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:32.872715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.379 [2024-11-16 16:43:32.872837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:32.872854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.379 #45 NEW cov: 11843 ft: 15394 corp: 37/1020b lim: 45 exec/s: 45 rss: 69Mb L: 26/43 MS: 1 EraseBytes- 00:07:47.379 [2024-11-16 16:43:32.913376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:7ad0ffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:32.913402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.379 [2024-11-16 16:43:32.913520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:32.913536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.379 [2024-11-16 16:43:32.913644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:32.913659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.379 [2024-11-16 16:43:32.913780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 
cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:32.913797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.379 #46 NEW cov: 11843 ft: 15399 corp: 38/1056b lim: 45 exec/s: 46 rss: 69Mb L: 36/43 MS: 1 CrossOver- 00:07:47.379 [2024-11-16 16:43:32.963307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:32.963334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.379 [2024-11-16 16:43:32.963449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:32.963466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.379 [2024-11-16 16:43:32.963589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:32.963606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.379 #47 NEW cov: 11843 ft: 15465 corp: 39/1085b lim: 45 exec/s: 47 rss: 69Mb L: 29/43 MS: 1 ChangeBinInt- 00:07:47.379 [2024-11-16 16:43:33.003618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:33.003643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.379 [2024-11-16 16:43:33.003745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:33.003762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.379 [2024-11-16 16:43:33.003884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ff7affff cdw11:d8890000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:33.003899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.379 [2024-11-16 16:43:33.004026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffff19e4 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.379 [2024-11-16 16:43:33.004042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.379 #48 NEW cov: 11843 ft: 15487 corp: 40/1121b lim: 45 exec/s: 24 rss: 69Mb L: 36/43 MS: 1 ChangeByte- 00:07:47.379 #48 DONE cov: 11843 ft: 15487 corp: 40/1121b lim: 45 exec/s: 24 rss: 69Mb 00:07:47.379 ###### Recommended dictionary. 
###### 00:07:47.379 "z\320\211\344\031\027\212\000" # Uses: 2 00:07:47.380 "\017\000\000\000\000\000\000\000" # Uses: 1 00:07:47.380 "\000\000\000\000" # Uses: 0 00:07:47.380 "A\263p:\032\027\212\000" # Uses: 0 00:07:47.380 ###### End of recommended dictionary. ###### 00:07:47.380 Done 48 runs in 2 second(s) 00:07:47.639 16:43:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:47.639 16:43:33 -- ../common.sh@72 -- # (( i++ )) 00:07:47.639 16:43:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.639 16:43:33 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:47.639 16:43:33 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:47.639 16:43:33 -- nvmf/run.sh@24 -- # local timen=1 00:07:47.639 16:43:33 -- nvmf/run.sh@25 -- # local core=0x1 00:07:47.639 16:43:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:47.639 16:43:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:47.639 16:43:33 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:47.639 16:43:33 -- nvmf/run.sh@29 -- # port=4406 00:07:47.640 16:43:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:47.640 16:43:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:47.640 16:43:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:47.640 16:43:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:47.640 [2024-11-16 16:43:33.186681] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:47.640 [2024-11-16 16:43:33.186756] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid485776 ] 00:07:47.640 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.640 [2024-11-16 16:43:33.359296] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.640 [2024-11-16 16:43:33.378748] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:47.640 [2024-11-16 16:43:33.378861] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.900 [2024-11-16 16:43:33.430139] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.900 [2024-11-16 16:43:33.446459] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:47.900 INFO: Running with entropic power schedule (0xFF, 100). 
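Run 6 starts from an empty corpus against port 4406 and moves on to the DELETE IO CQ admin opcode; the NEW_FUNC lines below attribute its coverage to fuzz_admin_delete_io_completion_queue_command, reached from TestOneInput in llvm_nvme_fuzz.c. A rough libFuzzer-style sketch of that shape, with a hypothetical print-only stand-in for the real submission path (spdk_nvme_ctrlr_cmd_admin_raw() in SPDK):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

struct nvme_cmd { uint8_t opc; uint16_t cid; uint32_t nsid, cdw10, cdw11; };

/* Hypothetical stand-in for submitting over the admin qpair; it only prints
 * the command the way nvme_admin_qpair_print_command does in the log. */
static void submit_admin_cmd(const struct nvme_cmd *cmd)
{
    printf("DELETE IO CQ (%02x) cid:%u cdw10:%08x cdw11:%08x\n",
           cmd->opc, cmd->cid, cmd->cdw10, cmd->cdw11);
}

int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    struct nvme_cmd cmd = { .opc = 0x04 };   /* DELETE IO CQ */

    /* Splat up to 8 input bytes over cdw10/cdw11 -- this is how values like
     * cdw10:00000a28 in the records below arise. */
    if (size)
        memcpy(&cmd.cdw10, data, size < 4 ? size : 4);
    if (size > 4) {
        size_t n = size - 4;
        memcpy(&cmd.cdw11, data + 4, n < 4 ? n : 4);
    }

    submit_admin_cmd(&cmd);   /* target must reject bogus qids, not crash */
    return 0;
}
```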
00:07:47.900 INFO: Seed: 4252333477 00:07:47.900 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:47.900 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:47.900 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:47.900 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.900 #2 INITED exec/s: 0 rss: 59Mb 00:07:47.900 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:47.900 This may also happen if the target rejected all inputs we tried so far 00:07:47.900 [2024-11-16 16:43:33.491617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a28 cdw11:00000000 00:07:47.900 [2024-11-16 16:43:33.491646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.159 NEW_FUNC[1/669]: 0x45bec8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:48.159 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.159 #6 NEW cov: 11528 ft: 11534 corp: 2/3b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 4 ShuffleBytes-ShuffleBytes-ShuffleBytes-InsertByte- 00:07:48.159 [2024-11-16 16:43:33.813091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a02 cdw11:00000000 00:07:48.159 [2024-11-16 16:43:33.813132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.159 #9 NEW cov: 11646 ft: 12317 corp: 3/5b lim: 10 exec/s: 0 rss: 67Mb L: 2/2 MS: 3 EraseBytes-ChangeBit-CrossOver- 00:07:48.159 [2024-11-16 16:43:33.863584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.159 [2024-11-16 16:43:33.863613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.159 [2024-11-16 16:43:33.863736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.159 [2024-11-16 16:43:33.863754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.159 [2024-11-16 16:43:33.863863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:48.159 [2024-11-16 16:43:33.863881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.159 #10 NEW cov: 11652 ft: 12708 corp: 4/11b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:48.159 [2024-11-16 16:43:33.903294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:07:48.159 [2024-11-16 16:43:33.903320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.418 #12 NEW cov: 11737 ft: 12981 corp: 5/13b lim: 10 exec/s: 0 rss: 67Mb L: 2/6 MS: 2 CopyPart-InsertByte- 00:07:48.419 [2024-11-16 16:43:33.943822] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.419 [2024-11-16 16:43:33.943849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.419 [2024-11-16 16:43:33.943971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.419 [2024-11-16 16:43:33.943987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.419 [2024-11-16 16:43:33.944106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:48.419 [2024-11-16 16:43:33.944123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.419 #13 NEW cov: 11737 ft: 13056 corp: 6/19b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 ShuffleBytes- 00:07:48.419 [2024-11-16 16:43:33.993532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a02 cdw11:00000000 00:07:48.419 [2024-11-16 16:43:33.993559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.419 #14 NEW cov: 11737 ft: 13226 corp: 7/21b lim: 10 exec/s: 0 rss: 67Mb L: 2/6 MS: 1 CrossOver- 00:07:48.419 [2024-11-16 16:43:34.033692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a2a cdw11:00000000 00:07:48.419 [2024-11-16 16:43:34.033718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.419 #15 NEW cov: 11737 ft: 13294 corp: 8/23b lim: 10 exec/s: 0 rss: 67Mb L: 2/6 MS: 1 CopyPart- 00:07:48.419 [2024-11-16 16:43:34.083905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000021ff cdw11:00000000 00:07:48.419 [2024-11-16 16:43:34.083931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.419 [2024-11-16 16:43:34.084046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.419 [2024-11-16 16:43:34.084076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.419 #17 NEW cov: 11737 ft: 13445 corp: 9/27b lim: 10 exec/s: 0 rss: 67Mb L: 4/6 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:48.419 [2024-11-16 16:43:34.123906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005dfa cdw11:00000000 00:07:48.419 [2024-11-16 16:43:34.123932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.419 #21 NEW cov: 11737 ft: 13476 corp: 10/29b lim: 10 exec/s: 0 rss: 67Mb L: 2/6 MS: 4 ChangeBinInt-ChangeBit-ShuffleBytes-InsertByte- 00:07:48.419 [2024-11-16 16:43:34.164470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000080 cdw11:00000000 00:07:48.419 [2024-11-16 16:43:34.164496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.419 
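Reading the completion prints: the parenthesised pair is (sct/sc), so INVALID OPCODE (00/01) means status code type 0x0 (generic) with status code 0x1; sqhd is the advancing submission-queue head, and p/m/dnr are the phase, more, and do-not-retry bits. A small decoder over that layout, assuming the little-endian bitfield packing SPDK's spdk_nvme_status uses:

```c
#include <stdint.h>
#include <stdio.h>

/* Completion status word, per the NVMe completion queue entry layout. */
struct nvme_cpl_status {
    uint16_t p   : 1;   /* phase tag */
    uint16_t sc  : 8;   /* status code: 0x01 = invalid opcode */
    uint16_t sct : 3;   /* status code type: 0x0 = generic command status */
    uint16_t crd : 2;   /* command retry delay */
    uint16_t m   : 1;   /* more status information available */
    uint16_t dnr : 1;   /* do not retry */
};

int main(void)
{
    /* The status every fuzzed command in these runs came back with. */
    struct nvme_cpl_status st = { .sc = 0x01, .sct = 0x0 };

    printf("(%02x/%02x) p:%x m:%x dnr:%x\n", st.sct, st.sc, st.p, st.m, st.dnr);
    return 0;
}
```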
[2024-11-16 16:43:34.164611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.419 [2024-11-16 16:43:34.164628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.419 [2024-11-16 16:43:34.164756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:48.419 [2024-11-16 16:43:34.164771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.678 #22 NEW cov: 11737 ft: 13534 corp: 11/35b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 ChangeBit- 00:07:48.678 [2024-11-16 16:43:34.204159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e28 cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.204185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.678 #23 NEW cov: 11737 ft: 13580 corp: 12/37b lim: 10 exec/s: 0 rss: 67Mb L: 2/6 MS: 1 ChangeBit- 00:07:48.678 [2024-11-16 16:43:34.244511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.244537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.678 [2024-11-16 16:43:34.244658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.244678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.678 #24 NEW cov: 11737 ft: 13608 corp: 13/41b lim: 10 exec/s: 0 rss: 67Mb L: 4/6 MS: 1 EraseBytes- 00:07:48.678 [2024-11-16 16:43:34.285043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.285069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.678 [2024-11-16 16:43:34.285189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.285205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.678 [2024-11-16 16:43:34.285315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.285330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.678 [2024-11-16 16:43:34.285440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.285456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.678 #25 NEW cov: 11737 ft: 13869 corp: 14/50b lim: 10 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:48.678 [2024-11-16 16:43:34.324724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 
nsid:0 cdw10:00000039 cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.324750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.678 [2024-11-16 16:43:34.324864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.324880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.678 #26 NEW cov: 11737 ft: 13921 corp: 15/55b lim: 10 exec/s: 0 rss: 67Mb L: 5/9 MS: 1 InsertByte- 00:07:48.678 [2024-11-16 16:43:34.364675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.364700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.678 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:48.678 #32 NEW cov: 11760 ft: 13940 corp: 16/57b lim: 10 exec/s: 0 rss: 67Mb L: 2/9 MS: 1 CrossOver- 00:07:48.678 [2024-11-16 16:43:34.425022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.425056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.678 [2024-11-16 16:43:34.425183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:07:48.678 [2024-11-16 16:43:34.425204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.938 #33 NEW cov: 11760 ft: 14025 corp: 17/61b lim: 10 exec/s: 0 rss: 67Mb L: 4/9 MS: 1 CopyPart- 00:07:48.938 [2024-11-16 16:43:34.485205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:07:48.938 [2024-11-16 16:43:34.485240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.938 [2024-11-16 16:43:34.485376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002a28 cdw11:00000000 00:07:48.938 [2024-11-16 16:43:34.485399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.938 #34 NEW cov: 11760 ft: 14064 corp: 18/65b lim: 10 exec/s: 34 rss: 67Mb L: 4/9 MS: 1 CrossOver- 00:07:48.938 [2024-11-16 16:43:34.545601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.938 [2024-11-16 16:43:34.545634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.938 [2024-11-16 16:43:34.545785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000005b cdw11:00000000 00:07:48.939 [2024-11-16 16:43:34.545807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.939 [2024-11-16 16:43:34.545946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 
nsid:0 cdw10:0000000a cdw11:00000000 00:07:48.939 [2024-11-16 16:43:34.545968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.939 #35 NEW cov: 11760 ft: 14140 corp: 19/71b lim: 10 exec/s: 35 rss: 68Mb L: 6/9 MS: 1 ChangeByte- 00:07:48.939 [2024-11-16 16:43:34.615956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.939 [2024-11-16 16:43:34.616028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.939 [2024-11-16 16:43:34.616199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000002a cdw11:00000000 00:07:48.939 [2024-11-16 16:43:34.616242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.939 #36 NEW cov: 11760 ft: 14173 corp: 20/75b lim: 10 exec/s: 36 rss: 68Mb L: 4/9 MS: 1 ChangeBit- 00:07:48.939 [2024-11-16 16:43:34.666427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000080 cdw11:00000000 00:07:48.939 [2024-11-16 16:43:34.666454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.939 [2024-11-16 16:43:34.666577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.939 [2024-11-16 16:43:34.666591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.939 [2024-11-16 16:43:34.666706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:48.939 [2024-11-16 16:43:34.666723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.939 [2024-11-16 16:43:34.666834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:48.939 [2024-11-16 16:43:34.666850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.939 [2024-11-16 16:43:34.666963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:48.939 [2024-11-16 16:43:34.666980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.199 #37 NEW cov: 11760 ft: 14245 corp: 21/85b lim: 10 exec/s: 37 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:49.199 [2024-11-16 16:43:34.716399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.716427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.199 [2024-11-16 16:43:34.716539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.716557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.199 
[2024-11-16 16:43:34.716666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.716686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.199 [2024-11-16 16:43:34.716806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.716824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.199 #43 NEW cov: 11760 ft: 14254 corp: 22/94b lim: 10 exec/s: 43 rss: 68Mb L: 9/10 MS: 1 CopyPart- 00:07:49.199 [2024-11-16 16:43:34.756091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000dd00 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.756118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.199 [2024-11-16 16:43:34.756233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.756250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.199 #44 NEW cov: 11760 ft: 14292 corp: 23/98b lim: 10 exec/s: 44 rss: 68Mb L: 4/10 MS: 1 ChangeBinInt- 00:07:49.199 [2024-11-16 16:43:34.796165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.796195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.199 [2024-11-16 16:43:34.796300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000002a cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.796317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.199 #45 NEW cov: 11760 ft: 14296 corp: 24/102b lim: 10 exec/s: 45 rss: 68Mb L: 4/10 MS: 1 CrossOver- 00:07:49.199 [2024-11-16 16:43:34.846573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.846598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.199 [2024-11-16 16:43:34.846734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000096 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.846751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.199 [2024-11-16 16:43:34.846867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005b00 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.846884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.199 #46 NEW cov: 11760 ft: 14332 corp: 25/109b lim: 10 exec/s: 46 rss: 68Mb L: 7/10 MS: 1 InsertByte- 00:07:49.199 [2024-11-16 16:43:34.896749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 
cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.896775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.199 [2024-11-16 16:43:34.896909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.896926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.199 [2024-11-16 16:43:34.897038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.897054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.199 #47 NEW cov: 11760 ft: 14355 corp: 26/116b lim: 10 exec/s: 47 rss: 68Mb L: 7/10 MS: 1 InsertByte- 00:07:49.199 [2024-11-16 16:43:34.936614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.936641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.199 [2024-11-16 16:43:34.936771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000002a cdw11:00000000 00:07:49.199 [2024-11-16 16:43:34.936788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.458 #48 NEW cov: 11760 ft: 14364 corp: 27/121b lim: 10 exec/s: 48 rss: 68Mb L: 5/10 MS: 1 InsertByte- 00:07:49.458 [2024-11-16 16:43:34.976510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:49.458 [2024-11-16 16:43:34.976536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.458 #49 NEW cov: 11760 ft: 14378 corp: 28/123b lim: 10 exec/s: 49 rss: 68Mb L: 2/10 MS: 1 ChangeBinInt- 00:07:49.458 [2024-11-16 16:43:35.016898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000400 cdw11:00000000 00:07:49.458 [2024-11-16 16:43:35.016926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.458 [2024-11-16 16:43:35.017052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:07:49.458 [2024-11-16 16:43:35.017068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.458 #50 NEW cov: 11760 ft: 14388 corp: 29/127b lim: 10 exec/s: 50 rss: 68Mb L: 4/10 MS: 1 ChangeBinInt- 00:07:49.458 [2024-11-16 16:43:35.056724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:49.458 [2024-11-16 16:43:35.056750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.458 #51 NEW cov: 11760 ft: 14402 corp: 30/129b lim: 10 exec/s: 51 rss: 68Mb L: 2/10 MS: 1 CrossOver- 00:07:49.458 [2024-11-16 16:43:35.096830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 
cid:4 nsid:0 cdw10:0000ff02 cdw11:00000000 00:07:49.458 [2024-11-16 16:43:35.096855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.458 #52 NEW cov: 11760 ft: 14403 corp: 31/131b lim: 10 exec/s: 52 rss: 68Mb L: 2/10 MS: 1 CrossOver- 00:07:49.458 [2024-11-16 16:43:35.137071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 00:07:49.458 [2024-11-16 16:43:35.137098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.458 #53 NEW cov: 11760 ft: 14407 corp: 32/133b lim: 10 exec/s: 53 rss: 68Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:49.458 [2024-11-16 16:43:35.177168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.458 [2024-11-16 16:43:35.177196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.458 #54 NEW cov: 11760 ft: 14413 corp: 33/136b lim: 10 exec/s: 54 rss: 68Mb L: 3/10 MS: 1 EraseBytes- 00:07:49.717 [2024-11-16 16:43:35.217229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008100 cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.217257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.217377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.217393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.717 #55 NEW cov: 11760 ft: 14427 corp: 34/141b lim: 10 exec/s: 55 rss: 68Mb L: 5/10 MS: 1 InsertByte- 00:07:49.717 [2024-11-16 16:43:35.267703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.267729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.267851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.267868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.717 #56 NEW cov: 11760 ft: 14451 corp: 35/146b lim: 10 exec/s: 56 rss: 68Mb L: 5/10 MS: 1 InsertByte- 00:07:49.717 [2024-11-16 16:43:35.307899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.307926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.308037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.308056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.717 #57 NEW cov: 11760 ft: 14463 corp: 36/151b lim: 10 exec/s: 57 rss: 68Mb L: 5/10 MS: 1 ShuffleBytes- 00:07:49.717 
[2024-11-16 16:43:35.358570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.358597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.358722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.358739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.358858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.358875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.358992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.359009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.359129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.359146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.717 #58 NEW cov: 11760 ft: 14465 corp: 37/161b lim: 10 exec/s: 58 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:07:49.717 [2024-11-16 16:43:35.408525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.408556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.408666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000005b cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.408687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.408802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.408819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.717 [2024-11-16 16:43:35.408941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:49.717 [2024-11-16 16:43:35.408960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.718 #59 NEW cov: 11760 ft: 14475 corp: 38/169b lim: 10 exec/s: 59 rss: 68Mb L: 8/10 MS: 1 CrossOver- 00:07:49.718 [2024-11-16 16:43:35.448163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:49.718 [2024-11-16 16:43:35.448190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.718 
[2024-11-16 16:43:35.448306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002828 cdw11:00000000 00:07:49.718 [2024-11-16 16:43:35.448330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.977 #60 NEW cov: 11760 ft: 14486 corp: 39/173b lim: 10 exec/s: 60 rss: 68Mb L: 4/10 MS: 1 CopyPart- 00:07:49.977 [2024-11-16 16:43:35.488923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 00:07:49.977 [2024-11-16 16:43:35.488951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.977 [2024-11-16 16:43:35.489069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a2a cdw11:00000000 00:07:49.977 [2024-11-16 16:43:35.489086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.977 [2024-11-16 16:43:35.489199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:49.977 [2024-11-16 16:43:35.489217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.977 [2024-11-16 16:43:35.489328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.977 [2024-11-16 16:43:35.489344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.977 [2024-11-16 16:43:35.489458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:07:49.977 [2024-11-16 16:43:35.489476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.977 #61 NEW cov: 11760 ft: 14503 corp: 40/183b lim: 10 exec/s: 30 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:07:49.977 #61 DONE cov: 11760 ft: 14503 corp: 40/183b lim: 10 exec/s: 30 rss: 69Mb 00:07:49.977 Done 61 runs in 2 second(s) 00:07:49.977 16:43:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:49.977 16:43:35 -- ../common.sh@72 -- # (( i++ )) 00:07:49.977 16:43:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.977 16:43:35 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:49.977 16:43:35 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:49.977 16:43:35 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.977 16:43:35 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.977 16:43:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:49.977 16:43:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:49.977 16:43:35 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:49.977 16:43:35 -- nvmf/run.sh@29 -- # port=4407 00:07:49.977 16:43:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:49.977 16:43:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:49.977 16:43:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.977 16:43:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:49.977 [2024-11-16 16:43:35.672195] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:49.977 [2024-11-16 16:43:35.672264] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid486309 ] 00:07:49.977 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.237 [2024-11-16 16:43:35.845636] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.237 [2024-11-16 16:43:35.864870] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:50.237 [2024-11-16 16:43:35.864985] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.237 [2024-11-16 16:43:35.916241] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.237 [2024-11-16 16:43:35.932539] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:50.237 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.237 INFO: Seed: 2443373427 00:07:50.237 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:50.237 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:50.237 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:50.237 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.237 #2 INITED exec/s: 0 rss: 59Mb 00:07:50.237 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:50.237 This may also happen if the target rejected all inputs we tried so far 00:07:50.496 [2024-11-16 16:43:36.008688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:50.496 [2024-11-16 16:43:36.008742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.496 [2024-11-16 16:43:36.008862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.496 [2024-11-16 16:43:36.008878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.756 NEW_FUNC[1/668]: 0x45c8c8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:50.756 NEW_FUNC[2/668]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.756 #6 NEW cov: 11521 ft: 11528 corp: 2/6b lim: 10 exec/s: 0 rss: 66Mb L: 5/5 MS: 4 CrossOver-ShuffleBytes-ChangeBit-CMP- DE: "\000\000\000@"- 00:07:50.756 [2024-11-16 16:43:36.319436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.319474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.756 [2024-11-16 16:43:36.319584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.319605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.756 NEW_FUNC[1/1]: 0x16c2448 in _nvme_qpair_complete_abort_queued_reqs /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:593 00:07:50.756 #7 NEW cov: 11646 ft: 12060 corp: 3/11b lim: 10 exec/s: 0 rss: 67Mb L: 5/5 MS: 1 ChangeBit- 00:07:50.756 [2024-11-16 16:43:36.369817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a31 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.369845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.756 [2024-11-16 16:43:36.369973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.369990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.756 [2024-11-16 16:43:36.370093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.370108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.756 #8 NEW cov: 11652 ft: 12594 corp: 4/17b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 InsertByte- 00:07:50.756 [2024-11-16 16:43:36.409888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a5d cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.409915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.756 [2024-11-16 16:43:36.410051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.410067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.756 [2024-11-16 16:43:36.410181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.410198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.756 #9 NEW cov: 11737 ft: 12849 corp: 5/23b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 InsertByte- 00:07:50.756 [2024-11-16 16:43:36.449982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a31 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.450008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.756 [2024-11-16 16:43:36.450145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.450160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.756 [2024-11-16 16:43:36.450285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000440 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.450301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.756 #10 NEW cov: 11737 ft: 12947 corp: 6/29b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 ChangeBit- 00:07:50.756 [2024-11-16 16:43:36.490136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003a5d cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.490163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.756 [2024-11-16 16:43:36.490284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.490299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.756 [2024-11-16 16:43:36.490417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:50.756 [2024-11-16 16:43:36.490434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.015 #11 NEW cov: 11737 ft: 13039 corp: 7/35b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 ChangeBit- 00:07:51.015 [2024-11-16 16:43:36.530307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a5d cdw11:00000000 00:07:51.015 [2024-11-16 16:43:36.530333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.015 [2024-11-16 16:43:36.530449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 
00:07:51.015 [2024-11-16 16:43:36.530465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.015 [2024-11-16 16:43:36.530584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.015 [2024-11-16 16:43:36.530600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.015 #12 NEW cov: 11737 ft: 13141 corp: 8/41b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 ShuffleBytes- 00:07:51.016 [2024-11-16 16:43:36.570331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007c5d cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.570356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.570474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.570489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.570599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.570614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.016 #13 NEW cov: 11737 ft: 13175 corp: 9/47b lim: 10 exec/s: 0 rss: 67Mb L: 6/6 MS: 1 ChangeByte- 00:07:51.016 [2024-11-16 16:43:36.610842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a31 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.610868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.610997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.611014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.611121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000400 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.611136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.611242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.611258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.611371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00004040 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.611387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.016 #14 NEW cov: 11737 ft: 13444 corp: 10/57b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 CrossOver- 00:07:51.016 [2024-11-16 16:43:36.650978] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007c5d cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.651005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.651113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.651130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.651255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.651272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.651386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.651405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.651517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.651533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.016 #15 NEW cov: 11737 ft: 13551 corp: 11/67b lim: 10 exec/s: 0 rss: 67Mb L: 10/10 MS: 1 PersAutoDict- DE: "\000\000\000@"- 00:07:51.016 [2024-11-16 16:43:36.701206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007c5d cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.701233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.701341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.701357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.701472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.701488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.701604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.701621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.701737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.701751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.016 #16 NEW cov: 11737 ft: 13569 corp: 12/77b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:07:51.016 [2024-11-16 
16:43:36.750667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.750699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.016 [2024-11-16 16:43:36.750815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 00:07:51.016 [2024-11-16 16:43:36.750832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.276 #17 NEW cov: 11737 ft: 13583 corp: 13/82b lim: 10 exec/s: 0 rss: 68Mb L: 5/10 MS: 1 ChangeBit- 00:07:51.276 [2024-11-16 16:43:36.801245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.801275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.276 [2024-11-16 16:43:36.801385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000e0e cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.801402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.276 [2024-11-16 16:43:36.801509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000e00 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.801525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.276 [2024-11-16 16:43:36.801632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000042 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.801649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.276 #18 NEW cov: 11737 ft: 13605 corp: 14/90b lim: 10 exec/s: 0 rss: 68Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:51.276 [2024-11-16 16:43:36.841545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007c55 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.841573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.276 [2024-11-16 16:43:36.841688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.841704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.276 [2024-11-16 16:43:36.841826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.841842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.276 [2024-11-16 16:43:36.841956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.841969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.276 
[2024-11-16 16:43:36.842080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.842097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.276 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.276 #19 NEW cov: 11760 ft: 13719 corp: 15/100b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:07:51.276 [2024-11-16 16:43:36.891056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.891082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.276 [2024-11-16 16:43:36.891184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.891201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.276 #20 NEW cov: 11760 ft: 13730 corp: 16/105b lim: 10 exec/s: 0 rss: 68Mb L: 5/10 MS: 1 CMP- DE: "\000\000\000\011"- 00:07:51.276 [2024-11-16 16:43:36.931017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a5d cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.931044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.276 #21 NEW cov: 11760 ft: 13983 corp: 17/108b lim: 10 exec/s: 0 rss: 68Mb L: 3/10 MS: 1 EraseBytes- 00:07:51.276 [2024-11-16 16:43:36.971336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.971363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.276 [2024-11-16 16:43:36.971479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.276 [2024-11-16 16:43:36.971495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.276 #22 NEW cov: 11760 ft: 14013 corp: 18/113b lim: 10 exec/s: 22 rss: 68Mb L: 5/10 MS: 1 ShuffleBytes- 00:07:51.277 [2024-11-16 16:43:37.011685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007c5d cdw11:00000000 00:07:51.277 [2024-11-16 16:43:37.011713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.277 [2024-11-16 16:43:37.011831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.277 [2024-11-16 16:43:37.011849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.277 [2024-11-16 16:43:37.011968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.277 [2024-11-16 16:43:37.011984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:07:51.536 #23 NEW cov: 11760 ft: 14019 corp: 19/119b lim: 10 exec/s: 23 rss: 68Mb L: 6/10 MS: 1 ChangeBinInt- 00:07:51.536 [2024-11-16 16:43:37.051605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.051633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.536 [2024-11-16 16:43:37.051747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.051764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.536 #24 NEW cov: 11760 ft: 14064 corp: 20/123b lim: 10 exec/s: 24 rss: 68Mb L: 4/10 MS: 1 CrossOver- 00:07:51.536 [2024-11-16 16:43:37.101579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.101609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.536 #25 NEW cov: 11760 ft: 14084 corp: 21/126b lim: 10 exec/s: 25 rss: 68Mb L: 3/10 MS: 1 CrossOver- 00:07:51.536 [2024-11-16 16:43:37.151842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a2e cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.151871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.536 [2024-11-16 16:43:37.151982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.151999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.536 [2024-11-16 16:43:37.152113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.152129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.536 #26 NEW cov: 11760 ft: 14135 corp: 22/132b lim: 10 exec/s: 26 rss: 68Mb L: 6/10 MS: 1 ChangeBinInt- 00:07:51.536 [2024-11-16 16:43:37.192515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007c5d cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.192546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.536 [2024-11-16 16:43:37.192647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.192663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.536 [2024-11-16 16:43:37.192787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000069 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.192804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.536 [2024-11-16 16:43:37.192939] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.192955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.536 [2024-11-16 16:43:37.193073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.193089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.536 #27 NEW cov: 11760 ft: 14150 corp: 23/142b lim: 10 exec/s: 27 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:07:51.536 [2024-11-16 16:43:37.231857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a13 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.231881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.536 #28 NEW cov: 11760 ft: 14162 corp: 24/144b lim: 10 exec/s: 28 rss: 68Mb L: 2/10 MS: 1 InsertByte- 00:07:51.536 [2024-11-16 16:43:37.272415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a27 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.272441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.536 [2024-11-16 16:43:37.272572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.272587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.536 [2024-11-16 16:43:37.272703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.536 [2024-11-16 16:43:37.272720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.796 #29 NEW cov: 11760 ft: 14180 corp: 25/150b lim: 10 exec/s: 29 rss: 68Mb L: 6/10 MS: 1 ChangeByte- 00:07:51.796 [2024-11-16 16:43:37.312721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a5d cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.312747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.312877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.312896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.313010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.313026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.313138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.313153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.796 #30 NEW cov: 11760 ft: 14193 corp: 26/159b lim: 10 exec/s: 30 rss: 68Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:51.796 [2024-11-16 16:43:37.353123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007c5d cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.353149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.353255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.353273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.353391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.353407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.353494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.353510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.353624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.353641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.796 #31 NEW cov: 11760 ft: 14198 corp: 27/169b lim: 10 exec/s: 31 rss: 68Mb L: 10/10 MS: 1 PersAutoDict- DE: "\000\000\000@"- 00:07:51.796 [2024-11-16 16:43:37.392776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.392802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.392926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.392943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.393061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000400 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.393076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.796 #32 NEW cov: 11760 ft: 14209 corp: 28/176b lim: 10 exec/s: 32 rss: 68Mb L: 7/10 MS: 1 CopyPart- 00:07:51.796 [2024-11-16 16:43:37.432678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.432705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.432812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 
nsid:0 cdw10:00000001 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.432826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.796 #33 NEW cov: 11760 ft: 14211 corp: 29/181b lim: 10 exec/s: 33 rss: 68Mb L: 5/10 MS: 1 InsertByte- 00:07:51.796 [2024-11-16 16:43:37.473027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a31 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.473052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.473152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.473167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.473269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000042 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.473285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.796 #34 NEW cov: 11760 ft: 14214 corp: 30/187b lim: 10 exec/s: 34 rss: 68Mb L: 6/10 MS: 1 ChangeBit- 00:07:51.796 [2024-11-16 16:43:37.513576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a5d cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.513600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.513729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.513746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.513869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.513883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.796 [2024-11-16 16:43:37.513999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.796 [2024-11-16 16:43:37.514016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.797 [2024-11-16 16:43:37.514125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 00:07:51.797 [2024-11-16 16:43:37.514140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.797 #35 NEW cov: 11760 ft: 14228 corp: 31/197b lim: 10 exec/s: 35 rss: 68Mb L: 10/10 MS: 1 PersAutoDict- DE: "\000\000\000@"- 00:07:52.056 [2024-11-16 16:43:37.553467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.553492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
dnr:0
00:07:52.056 [2024-11-16 16:43:37.553615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000001a cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.553630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.553762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00003100 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.553778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.553887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.553903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.056 #36 NEW cov: 11760 ft: 14235 corp: 32/206b lim: 10 exec/s: 36 rss: 68Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:52.056 [2024-11-16 16:43:37.593812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007c55 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.593837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.593967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.593984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.594100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.594117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.594234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000040 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.594252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.594366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.594382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.056 #37 NEW cov: 11760 ft: 14274 corp: 33/216b lim: 10 exec/s: 37 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:07:52.056 [2024-11-16 16:43:37.633278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.633304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.633420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.633436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:52.056 #38 NEW cov: 11760 ft: 14289 corp: 34/221b lim: 10 exec/s: 38 rss: 68Mb L: 5/10 MS: 1 CrossOver- 00:07:52.056 [2024-11-16 16:43:37.673811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.673836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.673955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.673970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.674081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.674096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.056 [2024-11-16 16:43:37.674204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000dbdb cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.674220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.056 #39 NEW cov: 11760 ft: 14311 corp: 35/230b lim: 10 exec/s: 39 rss: 69Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:52.056 [2024-11-16 16:43:37.713919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a27 cdw11:00000000 00:07:52.056 [2024-11-16 16:43:37.713944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.057 [2024-11-16 16:43:37.714066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.057 [2024-11-16 16:43:37.714083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.057 [2024-11-16 16:43:37.714193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.057 [2024-11-16 16:43:37.714209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.057 [2024-11-16 16:43:37.714321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.057 [2024-11-16 16:43:37.714342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.057 #40 NEW cov: 11760 ft: 14320 corp: 36/239b lim: 10 exec/s: 40 rss: 69Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:07:52.057 [2024-11-16 16:43:37.753653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a3f cdw11:00000000 00:07:52.057 [2024-11-16 16:43:37.753682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.057 [2024-11-16 16:43:37.753817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.057 [2024-11-16 
16:43:37.753833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.057 #41 NEW cov: 11760 ft: 14339 corp: 37/244b lim: 10 exec/s: 41 rss: 69Mb L: 5/10 MS: 1 InsertByte- 00:07:52.057 [2024-11-16 16:43:37.784412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.057 [2024-11-16 16:43:37.784439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.057 [2024-11-16 16:43:37.784549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 00:07:52.057 [2024-11-16 16:43:37.784565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.057 [2024-11-16 16:43:37.784672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00001a5d cdw11:00000000 00:07:52.057 [2024-11-16 16:43:37.784687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.057 [2024-11-16 16:43:37.784803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.057 [2024-11-16 16:43:37.784818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.057 [2024-11-16 16:43:37.784936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 00:07:52.057 [2024-11-16 16:43:37.784949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.057 #42 NEW cov: 11760 ft: 14427 corp: 38/254b lim: 10 exec/s: 42 rss: 69Mb L: 10/10 MS: 1 PersAutoDict- DE: "\000\000\000@"- 00:07:52.317 [2024-11-16 16:43:37.824489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007c5d cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.824515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.824629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.824645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.824756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004100 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.824772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.824885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000040 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.824900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.825002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 
00:07:52.317 [2024-11-16 16:43:37.825021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.317 #43 NEW cov: 11760 ft: 14430 corp: 39/264b lim: 10 exec/s: 43 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:07:52.317 [2024-11-16 16:43:37.864574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000005d cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.864599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.864714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000697c cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.864730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.864840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.864855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.864972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00004000 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.864989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.865107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000040 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.865123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:52.317 #44 NEW cov: 11760 ft: 14466 corp: 40/274b lim: 10 exec/s: 44 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:07:52.317 [2024-11-16 16:43:37.904534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.904559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.904677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000104 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.904692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.904813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.904830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.904940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000104 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.904956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.317 #45 NEW cov: 11760 ft: 14481 corp: 41/282b lim: 10 exec/s: 45 rss: 69Mb L: 8/10 MS: 1 CopyPart- 00:07:52.317 [2024-11-16 16:43:37.944040] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003100 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.944067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.317 #46 NEW cov: 11760 ft: 14486 corp: 42/285b lim: 10 exec/s: 46 rss: 69Mb L: 3/10 MS: 1 CrossOver- 00:07:52.317 [2024-11-16 16:43:37.984355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a91 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.984381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.317 [2024-11-16 16:43:37.984514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005d00 cdw11:00000000 00:07:52.317 [2024-11-16 16:43:37.984533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.317 #47 NEW cov: 11760 ft: 14499 corp: 43/289b lim: 10 exec/s: 23 rss: 69Mb L: 4/10 MS: 1 InsertByte- 00:07:52.317 #47 DONE cov: 11760 ft: 14499 corp: 43/289b lim: 10 exec/s: 23 rss: 69Mb 00:07:52.317 ###### Recommended dictionary. ###### 00:07:52.317 "\000\000\000@" # Uses: 4 00:07:52.317 "\000\000\000\011" # Uses: 0 00:07:52.317 ###### End of recommended dictionary. ###### 00:07:52.317 Done 47 runs in 2 second(s) 00:07:52.576 16:43:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:52.576 16:43:38 -- ../common.sh@72 -- # (( i++ )) 00:07:52.576 16:43:38 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.576 16:43:38 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:52.576 16:43:38 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:52.576 16:43:38 -- nvmf/run.sh@24 -- # local timen=1 00:07:52.576 16:43:38 -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.576 16:43:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:52.576 16:43:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:52.576 16:43:38 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:52.576 16:43:38 -- nvmf/run.sh@29 -- # port=4408 00:07:52.576 16:43:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:52.576 16:43:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:52.576 16:43:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.576 16:43:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:52.576 [2024-11-16 16:43:38.168252] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:52.576 [2024-11-16 16:43:38.168318] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid486605 ] 00:07:52.576 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.836 [2024-11-16 16:43:38.352271] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.836 [2024-11-16 16:43:38.372236] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.836 [2024-11-16 16:43:38.372352] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.836 [2024-11-16 16:43:38.423696] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.836 [2024-11-16 16:43:38.440043] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:52.836 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.836 INFO: Seed: 656412142 00:07:52.836 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:52.836 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:52.836 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:52.836 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.836 [2024-11-16 16:43:38.505961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.836 [2024-11-16 16:43:38.505997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.836 #2 INITED cov: 11561 ft: 11562 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:52.836 [2024-11-16 16:43:38.546016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.836 [2024-11-16 16:43:38.546045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.836 #3 NEW cov: 11674 ft: 12178 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:07:53.095 [2024-11-16 16:43:38.596170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.095 [2024-11-16 16:43:38.596197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.095 #4 NEW cov: 11680 ft: 12341 corp: 3/3b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ChangeByte- 00:07:53.095 [2024-11-16 16:43:38.636521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.095 [2024-11-16 16:43:38.636548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.096 [2024-11-16 16:43:38.636665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.636687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.096 #5 NEW cov: 11765 ft: 13180 corp: 4/5b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:53.096 [2024-11-16 16:43:38.676377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.676405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.096 #6 NEW cov: 11765 ft: 13319 corp: 5/6b lim: 5 exec/s: 0 rss: 66Mb L: 1/2 MS: 1 ChangeByte- 00:07:53.096 [2024-11-16 16:43:38.716790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.716815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.096 [2024-11-16 16:43:38.716951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.716967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.096 #7 NEW cov: 11765 ft: 13410 corp: 6/8b lim: 5 exec/s: 0 rss: 66Mb L: 2/2 MS: 1 InsertByte- 00:07:53.096 [2024-11-16 16:43:38.757661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.757689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.096 [2024-11-16 16:43:38.757802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.757818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.096 [2024-11-16 16:43:38.757936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.757954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.096 [2024-11-16 16:43:38.758069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.758084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.096 [2024-11-16 16:43:38.758212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.758227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:53.096 #8 NEW cov: 11765 ft: 13842 corp: 7/13b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:53.096 [2024-11-16 16:43:38.806769] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.096 [2024-11-16 16:43:38.806795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.096 #9 NEW cov: 11765 ft: 13876 corp: 8/14b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ChangeByte- 00:07:53.356 [2024-11-16 16:43:38.847182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.847208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:38.847310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.847326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.356 #10 NEW cov: 11765 ft: 13909 corp: 9/16b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 InsertByte- 00:07:53.356 [2024-11-16 16:43:38.887219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.887244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:38.887359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.887376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.356 #11 NEW cov: 11765 ft: 13974 corp: 10/18b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 InsertByte- 00:07:53.356 [2024-11-16 16:43:38.927200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.927225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.356 #12 NEW cov: 11765 ft: 13985 corp: 11/19b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:53.356 [2024-11-16 16:43:38.957758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.957784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:38.957924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.957940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:38.958059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 
cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.958074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.356 #13 NEW cov: 11765 ft: 14161 corp: 12/22b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 CrossOver- 00:07:53.356 [2024-11-16 16:43:38.997806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.997831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:38.997948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.997965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:38.998081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:38.998097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.356 #14 NEW cov: 11765 ft: 14214 corp: 13/25b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 CopyPart- 00:07:53.356 [2024-11-16 16:43:39.047964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:39.047988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:39.048108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:39.048124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:39.048241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:39.048257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.356 #15 NEW cov: 11765 ft: 14232 corp: 14/28b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:53.356 [2024-11-16 16:43:39.088041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:39.088066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:39.088200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:39.088217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:53.356 [2024-11-16 16:43:39.088338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.356 [2024-11-16 16:43:39.088353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.617 #16 NEW cov: 11765 ft: 14324 corp: 15/31b lim: 5 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:53.618 [2024-11-16 16:43:39.138029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.138055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.618 [2024-11-16 16:43:39.138180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.138196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.618 #17 NEW cov: 11765 ft: 14337 corp: 16/33b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:07:53.618 [2024-11-16 16:43:39.178125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.178151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.618 [2024-11-16 16:43:39.178266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.178294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.618 #18 NEW cov: 11765 ft: 14356 corp: 17/35b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 ChangeBit- 00:07:53.618 [2024-11-16 16:43:39.217973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.217997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.618 #19 NEW cov: 11765 ft: 14360 corp: 18/36b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeBit- 00:07:53.618 [2024-11-16 16:43:39.248610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.248635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.618 [2024-11-16 16:43:39.248758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.248774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.618 [2024-11-16 16:43:39.248888] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.248903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.618 #20 NEW cov: 11765 ft: 14413 corp: 19/39b lim: 5 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 ChangeBit- 00:07:53.618 [2024-11-16 16:43:39.288479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.288506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.618 [2024-11-16 16:43:39.288626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.288644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.618 #21 NEW cov: 11765 ft: 14424 corp: 20/41b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 InsertByte- 00:07:53.618 [2024-11-16 16:43:39.338293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.618 [2024-11-16 16:43:39.338319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.618 #22 NEW cov: 11765 ft: 14431 corp: 21/42b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 CopyPart- 00:07:53.876 [2024-11-16 16:43:39.378475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.877 [2024-11-16 16:43:39.378501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.136 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:54.136 #23 NEW cov: 11788 ft: 14464 corp: 22/43b lim: 5 exec/s: 23 rss: 68Mb L: 1/5 MS: 1 ChangeBit- 00:07:54.136 [2024-11-16 16:43:39.669576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.669608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.136 [2024-11-16 16:43:39.669746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.669763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.136 #24 NEW cov: 11788 ft: 14576 corp: 23/45b lim: 5 exec/s: 24 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:07:54.136 [2024-11-16 16:43:39.709646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.709677] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.136 [2024-11-16 16:43:39.709807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.709824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.136 #25 NEW cov: 11788 ft: 14583 corp: 24/47b lim: 5 exec/s: 25 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:54.136 [2024-11-16 16:43:39.760083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.760111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.136 [2024-11-16 16:43:39.760250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.760268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.136 [2024-11-16 16:43:39.760391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.760406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.136 #26 NEW cov: 11788 ft: 14627 corp: 25/50b lim: 5 exec/s: 26 rss: 68Mb L: 3/5 MS: 1 CopyPart- 00:07:54.136 [2024-11-16 16:43:39.809708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.809736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.136 #27 NEW cov: 11788 ft: 14647 corp: 26/51b lim: 5 exec/s: 27 rss: 68Mb L: 1/5 MS: 1 CrossOver- 00:07:54.136 [2024-11-16 16:43:39.850405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.850432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.136 [2024-11-16 16:43:39.850552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.850568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.136 [2024-11-16 16:43:39.850695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.136 [2024-11-16 16:43:39.850713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.136 #28 NEW cov: 11788 ft: 14662 corp: 27/54b lim: 5 exec/s: 28 rss: 69Mb L: 3/5 MS: 1 CrossOver- 00:07:54.395 [2024-11-16 
16:43:39.900795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.395 [2024-11-16 16:43:39.900824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.395 [2024-11-16 16:43:39.900967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.395 [2024-11-16 16:43:39.900983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.395 [2024-11-16 16:43:39.901083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:39.901097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:39.901229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:39.901244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.396 #29 NEW cov: 11788 ft: 14681 corp: 28/58b lim: 5 exec/s: 29 rss: 69Mb L: 4/5 MS: 1 CrossOver- 00:07:54.396 [2024-11-16 16:43:39.940652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:39.940686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:39.940830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:39.940849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:39.940990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:39.941007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.396 #30 NEW cov: 11788 ft: 14697 corp: 29/61b lim: 5 exec/s: 30 rss: 69Mb L: 3/5 MS: 1 CopyPart- 00:07:54.396 [2024-11-16 16:43:39.990188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:39.990214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.396 #31 NEW cov: 11788 ft: 14705 corp: 30/62b lim: 5 exec/s: 31 rss: 69Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:54.396 [2024-11-16 16:43:40.031520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 
[2024-11-16 16:43:40.031548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:40.031673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:40.031693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:40.031815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:40.031832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:40.031953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:40.031968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:40.032088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:40.032105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.396 #32 NEW cov: 11788 ft: 14716 corp: 31/67b lim: 5 exec/s: 32 rss: 69Mb L: 5/5 MS: 1 CrossOver- 00:07:54.396 [2024-11-16 16:43:40.080440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:40.080469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.396 #33 NEW cov: 11788 ft: 14754 corp: 32/68b lim: 5 exec/s: 33 rss: 69Mb L: 1/5 MS: 1 ChangeBit- 00:07:54.396 [2024-11-16 16:43:40.121442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:40.121469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:40.121591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:40.121608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:40.121729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:40.121745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.396 [2024-11-16 16:43:40.121867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.396 [2024-11-16 16:43:40.121884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.655 #34 NEW cov: 11788 ft: 14780 corp: 33/72b lim: 5 exec/s: 34 rss: 69Mb L: 4/5 MS: 1 EraseBytes- 00:07:54.655 [2024-11-16 16:43:40.171279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.655 [2024-11-16 16:43:40.171307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.171440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.171456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.171582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.171601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.656 #35 NEW cov: 11788 ft: 14791 corp: 34/75b lim: 5 exec/s: 35 rss: 69Mb L: 3/5 MS: 1 InsertByte- 00:07:54.656 [2024-11-16 16:43:40.211488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.211515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.211638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.211653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.211766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.211782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.656 #36 NEW cov: 11788 ft: 14829 corp: 35/78b lim: 5 exec/s: 36 rss: 69Mb L: 3/5 MS: 1 CopyPart- 00:07:54.656 [2024-11-16 16:43:40.261540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.261565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.261694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.261724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.261838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.261855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.656 #37 NEW cov: 11788 ft: 14845 corp: 36/81b lim: 5 exec/s: 37 rss: 69Mb L: 3/5 MS: 1 ChangeByte- 00:07:54.656 [2024-11-16 16:43:40.302026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.302052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.302182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.302198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.302314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.302330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.302448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.302463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.656 #38 NEW cov: 11788 ft: 14856 corp: 37/85b lim: 5 exec/s: 38 rss: 69Mb L: 4/5 MS: 1 ChangeByte- 00:07:54.656 [2024-11-16 16:43:40.352389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.352414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.352509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.352526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.352648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.352666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.352794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.352809] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.656 [2024-11-16 16:43:40.352932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.352948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.656 #39 NEW cov: 11788 ft: 14865 corp: 38/90b lim: 5 exec/s: 39 rss: 69Mb L: 5/5 MS: 1 CMP- DE: "\000\000\000\003"- 00:07:54.656 [2024-11-16 16:43:40.391489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.656 [2024-11-16 16:43:40.391513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.917 #40 NEW cov: 11788 ft: 14879 corp: 39/91b lim: 5 exec/s: 40 rss: 69Mb L: 1/5 MS: 1 CrossOver- 00:07:54.917 [2024-11-16 16:43:40.442136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.917 [2024-11-16 16:43:40.442163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.917 [2024-11-16 16:43:40.442302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.917 [2024-11-16 16:43:40.442319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.917 [2024-11-16 16:43:40.442447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.917 [2024-11-16 16:43:40.442463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.917 #41 NEW cov: 11788 ft: 14886 corp: 40/94b lim: 5 exec/s: 41 rss: 69Mb L: 3/5 MS: 1 ChangeByte- 00:07:54.917 [2024-11-16 16:43:40.482737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.917 [2024-11-16 16:43:40.482763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.917 [2024-11-16 16:43:40.482881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.917 [2024-11-16 16:43:40.482907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.917 [2024-11-16 16:43:40.483028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.917 [2024-11-16 16:43:40.483045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.917 [2024-11-16 16:43:40.483162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.917 [2024-11-16 16:43:40.483179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.917 [2024-11-16 16:43:40.483299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.917 [2024-11-16 16:43:40.483315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.917 #42 NEW cov: 11788 ft: 14904 corp: 41/99b lim: 5 exec/s: 21 rss: 69Mb L: 5/5 MS: 1 PersAutoDict- DE: "\000\000\000\003"- 00:07:54.917 #42 DONE cov: 11788 ft: 14904 corp: 41/99b lim: 5 exec/s: 21 rss: 69Mb 00:07:54.917 ###### Recommended dictionary. ###### 00:07:54.917 "\000\000\000\003" # Uses: 1 00:07:54.917 ###### End of recommended dictionary. ###### 00:07:54.917 Done 42 runs in 2 second(s) 00:07:54.917 16:43:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:54.917 16:43:40 -- ../common.sh@72 -- # (( i++ )) 00:07:54.917 16:43:40 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.917 16:43:40 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:54.917 16:43:40 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:54.917 16:43:40 -- nvmf/run.sh@24 -- # local timen=1 00:07:54.917 16:43:40 -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.917 16:43:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:54.917 16:43:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:54.917 16:43:40 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:54.917 16:43:40 -- nvmf/run.sh@29 -- # port=4409 00:07:54.917 16:43:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:54.917 16:43:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:54.917 16:43:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.917 16:43:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:54.917 [2024-11-16 16:43:40.664679] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:07:54.917 [2024-11-16 16:43:40.664767] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid487145 ] 00:07:55.176 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.176 [2024-11-16 16:43:40.839574] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.176 [2024-11-16 16:43:40.858903] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.176 [2024-11-16 16:43:40.859016] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.176 [2024-11-16 16:43:40.910357] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.436 [2024-11-16 16:43:40.926663] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:55.436 INFO: Running with entropic power schedule (0xFF, 100). 00:07:55.436 INFO: Seed: 3141401693 00:07:55.436 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:55.436 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:55.436 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:55.436 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.436 [2024-11-16 16:43:40.971927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:40.971957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.436 #2 INITED cov: 11548 ft: 11562 corp: 1/1b exec/s: 0 rss: 65Mb 00:07:55.436 [2024-11-16 16:43:41.001889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.001916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.436 #3 NEW cov: 11674 ft: 12094 corp: 2/2b lim: 5 exec/s: 0 rss: 66Mb L: 1/1 MS: 1 ShuffleBytes- 00:07:55.436 [2024-11-16 16:43:41.042438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.042465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.436 [2024-11-16 16:43:41.042522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.042537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.436 [2024-11-16 16:43:41.042590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.042604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.436 [2024-11-16 16:43:41.042659] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.042677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.436 #4 NEW cov: 11680 ft: 13087 corp: 3/6b lim: 5 exec/s: 0 rss: 66Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:55.436 [2024-11-16 16:43:41.082207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.082233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.436 [2024-11-16 16:43:41.082290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.082304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.436 #5 NEW cov: 11765 ft: 13573 corp: 4/8b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 InsertByte- 00:07:55.436 [2024-11-16 16:43:41.122367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.122393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.436 [2024-11-16 16:43:41.122450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.122466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.436 #6 NEW cov: 11765 ft: 13786 corp: 5/10b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 CrossOver- 00:07:55.436 [2024-11-16 16:43:41.162318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.436 [2024-11-16 16:43:41.162345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.436 #7 NEW cov: 11765 ft: 13888 corp: 6/11b lim: 5 exec/s: 0 rss: 66Mb L: 1/4 MS: 1 ChangeBinInt- 00:07:55.696 [2024-11-16 16:43:41.202570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.696 [2024-11-16 16:43:41.202597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.696 [2024-11-16 16:43:41.202654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.696 [2024-11-16 16:43:41.202673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.696 #8 NEW cov: 11765 ft: 13972 corp: 7/13b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 ShuffleBytes- 00:07:55.696 [2024-11-16 16:43:41.242686] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.696 [2024-11-16 16:43:41.242712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.696 [2024-11-16 16:43:41.242772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.696 [2024-11-16 16:43:41.242785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.696 #9 NEW cov: 11765 ft: 14029 corp: 8/15b lim: 5 exec/s: 0 rss: 66Mb L: 2/4 MS: 1 ChangeByte- 00:07:55.696 [2024-11-16 16:43:41.283325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.696 [2024-11-16 16:43:41.283352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.696 [2024-11-16 16:43:41.283406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.696 [2024-11-16 16:43:41.283420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.696 [2024-11-16 16:43:41.283472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.696 [2024-11-16 16:43:41.283486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.696 [2024-11-16 16:43:41.283541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.696 [2024-11-16 16:43:41.283554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.696 [2024-11-16 16:43:41.283608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.283621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.697 #10 NEW cov: 11765 ft: 14099 corp: 9/20b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertByte- 00:07:55.697 [2024-11-16 16:43:41.323277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.323304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.697 [2024-11-16 16:43:41.323363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.323376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:55.697 [2024-11-16 16:43:41.323430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.323444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.697 [2024-11-16 16:43:41.323500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.323514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.697 #11 NEW cov: 11765 ft: 14133 corp: 10/24b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:55.697 [2024-11-16 16:43:41.362910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.362936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.697 #12 NEW cov: 11765 ft: 14221 corp: 11/25b lim: 5 exec/s: 0 rss: 66Mb L: 1/5 MS: 1 CrossOver- 00:07:55.697 [2024-11-16 16:43:41.403144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.403170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.697 [2024-11-16 16:43:41.403226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.403240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.697 #13 NEW cov: 11765 ft: 14236 corp: 12/27b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ChangeByte- 00:07:55.697 [2024-11-16 16:43:41.443583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.443609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.697 [2024-11-16 16:43:41.443666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.443685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.697 [2024-11-16 16:43:41.443741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.443755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.697 [2024-11-16 16:43:41.443810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.697 [2024-11-16 16:43:41.443827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.957 #14 NEW cov: 11765 ft: 14253 corp: 13/31b lim: 5 exec/s: 0 rss: 66Mb L: 4/5 MS: 1 CopyPart- 00:07:55.957 [2024-11-16 16:43:41.493569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.493594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.493647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.493661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.493720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.493734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.957 #15 NEW cov: 11765 ft: 14454 corp: 14/34b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 EraseBytes- 00:07:55.957 [2024-11-16 16:43:41.533553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.533579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.533633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.533646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.957 #16 NEW cov: 11765 ft: 14475 corp: 15/36b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:55.957 [2024-11-16 16:43:41.574132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.574157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.574215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.574229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.574282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.574296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:55.957 [2024-11-16 16:43:41.574353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.574367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.574421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.574435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.957 #17 NEW cov: 11765 ft: 14506 corp: 16/41b lim: 5 exec/s: 0 rss: 66Mb L: 5/5 MS: 1 InsertByte- 00:07:55.957 [2024-11-16 16:43:41.613974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.614003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.614064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.614078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.614134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.614148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.957 #18 NEW cov: 11765 ft: 14523 corp: 17/44b lim: 5 exec/s: 0 rss: 66Mb L: 3/5 MS: 1 InsertByte- 00:07:55.957 [2024-11-16 16:43:41.653905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.653930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.653987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.654000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.957 #19 NEW cov: 11765 ft: 14533 corp: 18/46b lim: 5 exec/s: 0 rss: 66Mb L: 2/5 MS: 1 ChangeByte- 00:07:55.957 [2024-11-16 16:43:41.694211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.694236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.694292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:55.957 [2024-11-16 16:43:41.694305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.957 [2024-11-16 16:43:41.694361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.957 [2024-11-16 16:43:41.694374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.217 #20 NEW cov: 11765 ft: 14543 corp: 19/49b lim: 5 exec/s: 0 rss: 67Mb L: 3/5 MS: 1 CMP- DE: "\377\004"- 00:07:56.217 [2024-11-16 16:43:41.734120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.217 [2024-11-16 16:43:41.734147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.217 [2024-11-16 16:43:41.734205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.217 [2024-11-16 16:43:41.734219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.217 #21 NEW cov: 11765 ft: 14558 corp: 20/51b lim: 5 exec/s: 0 rss: 67Mb L: 2/5 MS: 1 PersAutoDict- DE: "\377\004"- 00:07:56.217 [2024-11-16 16:43:41.774089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.217 [2024-11-16 16:43:41.774117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.217 #22 NEW cov: 11765 ft: 14626 corp: 21/52b lim: 5 exec/s: 0 rss: 67Mb L: 1/5 MS: 1 ChangeByte- 00:07:56.217 [2024-11-16 16:43:41.814635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.217 [2024-11-16 16:43:41.814661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.217 [2024-11-16 16:43:41.814723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.217 [2024-11-16 16:43:41.814738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.217 [2024-11-16 16:43:41.814795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.217 [2024-11-16 16:43:41.814809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.217 [2024-11-16 16:43:41.814864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.217 [2024-11-16 16:43:41.814878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:07:56.217 #23 NEW cov: 11765 ft: 14721 corp: 22/56b lim: 5 exec/s: 0 rss: 67Mb L: 4/5 MS: 1 CopyPart- 00:07:56.217 [2024-11-16 16:43:41.854483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.217 [2024-11-16 16:43:41.854507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.217 [2024-11-16 16:43:41.854564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.217 [2024-11-16 16:43:41.854578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.476 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:56.476 #24 NEW cov: 11788 ft: 14797 corp: 23/58b lim: 5 exec/s: 24 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:07:56.476 [2024-11-16 16:43:42.145310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.476 [2024-11-16 16:43:42.145343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.477 [2024-11-16 16:43:42.145401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.477 [2024-11-16 16:43:42.145416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.477 #25 NEW cov: 11788 ft: 14820 corp: 24/60b lim: 5 exec/s: 25 rss: 68Mb L: 2/5 MS: 1 ChangeByte- 00:07:56.477 [2024-11-16 16:43:42.185511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.477 [2024-11-16 16:43:42.185539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.477 [2024-11-16 16:43:42.185599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.477 [2024-11-16 16:43:42.185613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.477 [2024-11-16 16:43:42.185681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.477 [2024-11-16 16:43:42.185696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.477 #26 NEW cov: 11788 ft: 14844 corp: 25/63b lim: 5 exec/s: 26 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:07:56.737 [2024-11-16 16:43:42.225661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.225693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.225755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.225769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.225828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.225849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.737 #27 NEW cov: 11788 ft: 14870 corp: 26/66b lim: 5 exec/s: 27 rss: 68Mb L: 3/5 MS: 1 CopyPart- 00:07:56.737 [2024-11-16 16:43:42.265620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.265646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.265708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.265722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.737 #28 NEW cov: 11788 ft: 14883 corp: 27/68b lim: 5 exec/s: 28 rss: 68Mb L: 2/5 MS: 1 ChangeBit- 00:07:56.737 [2024-11-16 16:43:42.305914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.305941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.306000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.306014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.306071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.306085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.737 #29 NEW cov: 11788 ft: 14896 corp: 28/71b lim: 5 exec/s: 29 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:07:56.737 [2024-11-16 16:43:42.346181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.346207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.346266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.346283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.346338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.346353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.346409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.346422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.737 #30 NEW cov: 11788 ft: 14910 corp: 29/75b lim: 5 exec/s: 30 rss: 68Mb L: 4/5 MS: 1 PersAutoDict- DE: "\377\004"- 00:07:56.737 [2024-11-16 16:43:42.385967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.385994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.386052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.386066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.737 #31 NEW cov: 11788 ft: 14927 corp: 30/77b lim: 5 exec/s: 31 rss: 68Mb L: 2/5 MS: 1 CrossOver- 00:07:56.737 [2024-11-16 16:43:42.426230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.426257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.426318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.426332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.426389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.426419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.737 #32 NEW cov: 11788 ft: 14958 corp: 31/80b lim: 5 exec/s: 32 rss: 68Mb L: 3/5 MS: 1 InsertByte- 00:07:56.737 [2024-11-16 16:43:42.466510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.466536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.466596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.466609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.466665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.466684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.737 [2024-11-16 16:43:42.466741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.737 [2024-11-16 16:43:42.466757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.997 #33 NEW cov: 11788 ft: 14964 corp: 32/84b lim: 5 exec/s: 33 rss: 68Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:56.997 [2024-11-16 16:43:42.506182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.506208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.997 [2024-11-16 16:43:42.506265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.506280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.997 #34 NEW cov: 11788 ft: 14981 corp: 33/86b lim: 5 exec/s: 34 rss: 68Mb L: 2/5 MS: 1 CopyPart- 00:07:56.997 [2024-11-16 16:43:42.546380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.546406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.997 [2024-11-16 16:43:42.546464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.546477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.997 #35 NEW cov: 11788 ft: 15048 corp: 34/88b lim: 5 exec/s: 35 rss: 68Mb L: 2/5 MS: 1 EraseBytes- 00:07:56.997 [2024-11-16 16:43:42.586881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.586908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.997 [2024-11-16 16:43:42.586966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.586980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.997 [2024-11-16 16:43:42.587037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.587051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.997 [2024-11-16 16:43:42.587108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.587122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.997 #36 NEW cov: 11788 ft: 15053 corp: 35/92b lim: 5 exec/s: 36 rss: 69Mb L: 4/5 MS: 1 InsertByte- 00:07:56.997 [2024-11-16 16:43:42.626941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.626967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.997 [2024-11-16 16:43:42.627024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.627042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.997 [2024-11-16 16:43:42.627096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.627110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.997 [2024-11-16 16:43:42.627164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.627178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.997 #37 NEW cov: 11788 ft: 15062 corp: 36/96b lim: 5 exec/s: 37 rss: 69Mb L: 4/5 MS: 1 ShuffleBytes- 00:07:56.997 [2024-11-16 16:43:42.666746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.997 [2024-11-16 16:43:42.666772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.997 [2024-11-16 16:43:42.666829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.998 [2024-11-16 16:43:42.666842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.998 #38 NEW cov: 11788 ft: 15072 corp: 37/98b lim: 5 exec/s: 38 rss: 69Mb L: 2/5 MS: 1 ChangeBit- 00:07:56.998 
[2024-11-16 16:43:42.706688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.998 [2024-11-16 16:43:42.706716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.998 #39 NEW cov: 11788 ft: 15089 corp: 38/99b lim: 5 exec/s: 39 rss: 69Mb L: 1/5 MS: 1 ChangeBit- 00:07:56.998 [2024-11-16 16:43:42.737315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.998 [2024-11-16 16:43:42.737341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.998 [2024-11-16 16:43:42.737400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.998 [2024-11-16 16:43:42.737415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.998 [2024-11-16 16:43:42.737471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.998 [2024-11-16 16:43:42.737485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.998 [2024-11-16 16:43:42.737540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.998 [2024-11-16 16:43:42.737553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.258 #40 NEW cov: 11788 ft: 15090 corp: 39/103b lim: 5 exec/s: 40 rss: 69Mb L: 4/5 MS: 1 CopyPart- 00:07:57.258 [2024-11-16 16:43:42.787111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.787137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.258 [2024-11-16 16:43:42.787196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.787215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.258 #41 NEW cov: 11788 ft: 15108 corp: 40/105b lim: 5 exec/s: 41 rss: 69Mb L: 2/5 MS: 1 EraseBytes- 00:07:57.258 [2024-11-16 16:43:42.827227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.827254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.258 [2024-11-16 16:43:42.827311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:57.258 [2024-11-16 16:43:42.827325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.258 #42 NEW cov: 11788 ft: 15116 corp: 41/107b lim: 5 exec/s: 42 rss: 69Mb L: 2/5 MS: 1 CMP- DE: "\001\000"- 00:07:57.258 [2024-11-16 16:43:42.867194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.867221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.258 #43 NEW cov: 11788 ft: 15158 corp: 42/108b lim: 5 exec/s: 43 rss: 69Mb L: 1/5 MS: 1 EraseBytes- 00:07:57.258 [2024-11-16 16:43:42.907479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.907506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.258 [2024-11-16 16:43:42.907564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.907578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.258 #44 NEW cov: 11788 ft: 15166 corp: 43/110b lim: 5 exec/s: 44 rss: 69Mb L: 2/5 MS: 1 PersAutoDict- DE: "\377\004"- 00:07:57.258 [2024-11-16 16:43:42.947881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.947908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.258 [2024-11-16 16:43:42.947966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.947980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.258 [2024-11-16 16:43:42.948035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.948049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.258 [2024-11-16 16:43:42.948105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.258 [2024-11-16 16:43:42.948118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.258 #45 NEW cov: 11788 ft: 15213 corp: 44/114b lim: 5 exec/s: 22 rss: 69Mb L: 4/5 MS: 1 ShuffleBytes- 00:07:57.258 #45 DONE cov: 11788 ft: 15213 corp: 44/114b lim: 5 exec/s: 22 rss: 69Mb 00:07:57.258 ###### Recommended dictionary. ###### 00:07:57.258 "\377\004" # Uses: 3 00:07:57.258 "\001\000" # Uses: 0 00:07:57.258 ###### End of recommended dictionary. 
###### 00:07:57.258 Done 45 runs in 2 second(s) 00:07:57.518 16:43:43 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:57.518 16:43:43 -- ../common.sh@72 -- # (( i++ )) 00:07:57.518 16:43:43 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.518 16:43:43 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:57.518 16:43:43 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:57.518 16:43:43 -- nvmf/run.sh@24 -- # local timen=1 00:07:57.518 16:43:43 -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.518 16:43:43 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:57.518 16:43:43 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:57.518 16:43:43 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:57.518 16:43:43 -- nvmf/run.sh@29 -- # port=4410 00:07:57.518 16:43:43 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:57.518 16:43:43 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:57.518 16:43:43 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.518 16:43:43 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:57.518 [2024-11-16 16:43:43.131985] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:07:57.518 [2024-11-16 16:43:43.132076] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid487591 ] 00:07:57.518 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.778 [2024-11-16 16:43:43.317236] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.778 [2024-11-16 16:43:43.336894] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:57.778 [2024-11-16 16:43:43.337008] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.778 [2024-11-16 16:43:43.388249] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.778 [2024-11-16 16:43:43.404582] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:57.778 INFO: Running with entropic power schedule (0xFF, 100). 00:07:57.778 INFO: Seed: 1325434097 00:07:57.778 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:07:57.778 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:07:57.778 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:57.778 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.778 #2 INITED exec/s: 0 rss: 59Mb 00:07:57.778 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:57.778 This may also happen if the target rejected all inputs we tried so far 00:07:57.778 [2024-11-16 16:43:43.449947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.778 [2024-11-16 16:43:43.449975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.778 [2024-11-16 16:43:43.450036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.778 [2024-11-16 16:43:43.450050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.037 NEW_FUNC[1/670]: 0x45e248 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:58.038 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.038 #13 NEW cov: 11568 ft: 11569 corp: 2/22b lim: 40 exec/s: 0 rss: 66Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:07:58.038 [2024-11-16 16:43:43.740408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.038 [2024-11-16 16:43:43.740439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.038 #14 NEW cov: 11697 ft: 12325 corp: 3/34b lim: 40 exec/s: 0 rss: 68Mb L: 12/21 MS: 1 EraseBytes- 00:07:58.297 [2024-11-16 16:43:43.790630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.297 [2024-11-16 16:43:43.790656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.297 [2024-11-16 16:43:43.790716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.297 [2024-11-16 16:43:43.790740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.297 #15 NEW cov: 11703 ft: 12568 corp: 4/55b lim: 40 exec/s: 0 rss: 68Mb L: 21/21 MS: 1 CopyPart- 00:07:58.297 [2024-11-16 16:43:43.830728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.297 [2024-11-16 16:43:43.830753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.297 [2024-11-16 16:43:43.830812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.297 [2024-11-16 16:43:43.830826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.297 #16 NEW cov: 11788 ft: 12899 corp: 5/76b lim: 40 exec/s: 0 rss: 68Mb L: 21/21 MS: 1 ShuffleBytes- 00:07:58.297 [2024-11-16 16:43:43.870851] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.297 [2024-11-16 16:43:43.870877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.297 [2024-11-16 16:43:43.870932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.297 [2024-11-16 16:43:43.870945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.297 #19 NEW cov: 11788 ft: 13000 corp: 6/96b lim: 40 exec/s: 0 rss: 68Mb L: 20/21 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:58.297 [2024-11-16 16:43:43.911175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.297 [2024-11-16 16:43:43.911201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.297 [2024-11-16 16:43:43.911255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.298 [2024-11-16 16:43:43.911269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.298 [2024-11-16 16:43:43.911321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.298 [2024-11-16 16:43:43.911335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.298 [2024-11-16 16:43:43.911392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.298 [2024-11-16 16:43:43.911405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.298 #20 NEW cov: 11788 ft: 13554 corp: 7/132b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 CrossOver- 00:07:58.298 [2024-11-16 16:43:43.950928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.298 [2024-11-16 16:43:43.950954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.298 #21 NEW cov: 11788 ft: 13754 corp: 8/144b lim: 40 exec/s: 0 rss: 68Mb L: 12/36 MS: 1 ChangeByte- 00:07:58.298 [2024-11-16 16:43:43.991422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.298 [2024-11-16 16:43:43.991447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.298 [2024-11-16 16:43:43.991499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.298 [2024-11-16 
16:43:43.991512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.298 [2024-11-16 16:43:43.991565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:5454540a cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.298 [2024-11-16 16:43:43.991579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.298 [2024-11-16 16:43:43.991633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.298 [2024-11-16 16:43:43.991646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.298 #27 NEW cov: 11788 ft: 13841 corp: 9/183b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 CrossOver- 00:07:58.298 [2024-11-16 16:43:44.031194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a505454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.298 [2024-11-16 16:43:44.031219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.557 #28 NEW cov: 11788 ft: 13874 corp: 10/195b lim: 40 exec/s: 0 rss: 68Mb L: 12/39 MS: 1 ChangeBinInt- 00:07:58.558 [2024-11-16 16:43:44.071457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.071482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.071538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.071552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.558 #29 NEW cov: 11788 ft: 13908 corp: 11/216b lim: 40 exec/s: 0 rss: 68Mb L: 21/39 MS: 1 ShuffleBytes- 00:07:58.558 [2024-11-16 16:43:44.111799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.111824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.111881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.111898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.111951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.111965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.112019] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.112033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.558 #30 NEW cov: 11788 ft: 13939 corp: 12/252b lim: 40 exec/s: 0 rss: 68Mb L: 36/39 MS: 1 ShuffleBytes- 00:07:58.558 [2024-11-16 16:43:44.151468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000000c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.151493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.558 #31 NEW cov: 11788 ft: 13987 corp: 13/264b lim: 40 exec/s: 0 rss: 68Mb L: 12/39 MS: 1 ChangeBinInt- 00:07:58.558 [2024-11-16 16:43:44.191748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.191773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.191828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.191842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.558 #32 NEW cov: 11788 ft: 14028 corp: 14/283b lim: 40 exec/s: 0 rss: 68Mb L: 19/39 MS: 1 EraseBytes- 00:07:58.558 [2024-11-16 16:43:44.232081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.232106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.232163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.232176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.232229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.232244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.232295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:55555554 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.232308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.558 #33 NEW cov: 11788 ft: 14078 corp: 15/317b lim: 40 exec/s: 0 rss: 68Mb L: 34/39 MS: 1 InsertRepeatedBytes- 00:07:58.558 [2024-11-16 16:43:44.272239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 
cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.272271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.272327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:543b3b3b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.272341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.272394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:3b3b3b3b cdw11:3b3b3b3b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.272407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.558 [2024-11-16 16:43:44.272461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:3b3b3b3b cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.558 [2024-11-16 16:43:44.272474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.558 #34 NEW cov: 11788 ft: 14189 corp: 16/353b lim: 40 exec/s: 0 rss: 68Mb L: 36/39 MS: 1 InsertRepeatedBytes- 00:07:58.818 [2024-11-16 16:43:44.312045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.312071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.312127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.312141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.818 #35 NEW cov: 11788 ft: 14198 corp: 17/374b lim: 40 exec/s: 0 rss: 68Mb L: 21/39 MS: 1 ChangeBit- 00:07:58.818 [2024-11-16 16:43:44.352194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a54a5ab cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.352221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.352277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:abab5454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.352292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.818 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.818 #36 NEW cov: 11811 ft: 14229 corp: 18/395b lim: 40 exec/s: 0 rss: 69Mb L: 21/39 MS: 1 ChangeBinInt- 00:07:58.818 [2024-11-16 16:43:44.392199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:58.818 [2024-11-16 16:43:44.392225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.818 #37 NEW cov: 11811 ft: 14263 corp: 19/407b lim: 40 exec/s: 0 rss: 69Mb L: 12/39 MS: 1 CrossOver- 00:07:58.818 [2024-11-16 16:43:44.422754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.422780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.422833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.422847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.422902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a545400 cdw11:00000054 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.422916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.422970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.422983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.423036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.423049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.818 #38 NEW cov: 11811 ft: 14350 corp: 20/447b lim: 40 exec/s: 38 rss: 69Mb L: 40/40 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:58.818 [2024-11-16 16:43:44.462519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:27000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.462544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.462601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.462614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.818 #44 NEW cov: 11811 ft: 14369 corp: 21/468b lim: 40 exec/s: 44 rss: 69Mb L: 21/40 MS: 1 InsertByte- 00:07:58.818 [2024-11-16 16:43:44.502908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.502933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 
16:43:44.502991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.503004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.503058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a545454 cdw11:54543254 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.503072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.503126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.503139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.818 #45 NEW cov: 11811 ft: 14402 corp: 22/504b lim: 40 exec/s: 45 rss: 69Mb L: 36/40 MS: 1 ChangeByte- 00:07:58.818 [2024-11-16 16:43:44.543186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a54acab cdw11:abb35454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.818 [2024-11-16 16:43:44.543211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.818 [2024-11-16 16:43:44.543269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.819 [2024-11-16 16:43:44.543288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.819 [2024-11-16 16:43:44.543342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a545400 cdw11:00000054 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.819 [2024-11-16 16:43:44.543356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.819 [2024-11-16 16:43:44.543412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.819 [2024-11-16 16:43:44.543425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.819 [2024-11-16 16:43:44.543479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.819 [2024-11-16 16:43:44.543492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.078 #46 NEW cov: 11811 ft: 14413 corp: 23/544b lim: 40 exec/s: 46 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:07:59.078 [2024-11-16 16:43:44.583048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54540000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.078 [2024-11-16 16:43:44.583074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.078 [2024-11-16 16:43:44.583132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00005454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.078 [2024-11-16 16:43:44.583147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.078 [2024-11-16 16:43:44.583202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.078 [2024-11-16 16:43:44.583225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.078 #47 NEW cov: 11811 ft: 14619 corp: 24/569b lim: 40 exec/s: 47 rss: 69Mb L: 25/40 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:59.078 [2024-11-16 16:43:44.622970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:27000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.078 [2024-11-16 16:43:44.622996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.078 [2024-11-16 16:43:44.623054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.078 [2024-11-16 16:43:44.623069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.078 #48 NEW cov: 11811 ft: 14635 corp: 25/590b lim: 40 exec/s: 48 rss: 69Mb L: 21/40 MS: 1 ShuffleBytes- 00:07:59.078 [2024-11-16 16:43:44.663399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a54a5ab cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.663425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.079 [2024-11-16 16:43:44.663481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:abab5454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.663495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.079 [2024-11-16 16:43:44.663550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:545454a5 cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.663566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.079 [2024-11-16 16:43:44.663620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ababab54 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.663633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.079 #49 NEW cov: 11811 ft: 14660 corp: 26/628b lim: 40 exec/s: 49 rss: 69Mb L: 38/40 MS: 1 CopyPart- 00:07:59.079 [2024-11-16 16:43:44.713306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.713332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.079 [2024-11-16 16:43:44.713389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.713403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.079 #50 NEW cov: 11811 ft: 14683 corp: 27/648b lim: 40 exec/s: 50 rss: 69Mb L: 20/40 MS: 1 ShuffleBytes- 00:07:59.079 [2024-11-16 16:43:44.753410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.753436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.079 [2024-11-16 16:43:44.753491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.753504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.079 #51 NEW cov: 11811 ft: 14696 corp: 28/667b lim: 40 exec/s: 51 rss: 69Mb L: 19/40 MS: 1 ChangeBit- 00:07:59.079 [2024-11-16 16:43:44.793758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.793784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.079 [2024-11-16 16:43:44.793845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.793858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.079 [2024-11-16 16:43:44.793915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:55555555 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.793928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.079 [2024-11-16 16:43:44.793985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:55555554 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.079 [2024-11-16 16:43:44.793998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.079 #52 NEW cov: 11811 ft: 14713 corp: 29/701b lim: 40 exec/s: 52 rss: 69Mb L: 34/40 MS: 1 ShuffleBytes- 00:07:59.339 [2024-11-16 16:43:44.834003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a54acab cdw11:abb35454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.834030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.834089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.834103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.834157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a5454b9 cdw11:00000054 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.834171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.834225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.834238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.834295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.834308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.339 #53 NEW cov: 11811 ft: 14716 corp: 30/741b lim: 40 exec/s: 53 rss: 69Mb L: 40/40 MS: 1 ChangeByte- 00:07:59.339 [2024-11-16 16:43:44.873722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.873747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.873802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.873816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.339 #54 NEW cov: 11811 ft: 14724 corp: 31/763b lim: 40 exec/s: 54 rss: 69Mb L: 22/40 MS: 1 CopyPart- 00:07:59.339 [2024-11-16 16:43:44.914088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.914113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.914169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.914183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.914222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:55555555 cdw11:5555ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.914235] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.914292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffff55 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.914305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.339 #55 NEW cov: 11811 ft: 14734 corp: 32/802b lim: 40 exec/s: 55 rss: 70Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:07:59.339 [2024-11-16 16:43:44.954201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a54a5ab cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.954226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.954285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:abab5454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.954299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.954352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54c2c2c2 cdw11:c2c2c2c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.954365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.954423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c2c2c2c2 cdw11:c2545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.954436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.339 #56 NEW cov: 11811 ft: 14756 corp: 33/835b lim: 40 exec/s: 56 rss: 70Mb L: 33/40 MS: 1 InsertRepeatedBytes- 00:07:59.339 [2024-11-16 16:43:44.994055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a54a5ab cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.994081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.339 [2024-11-16 16:43:44.994136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:abab5454 cdw11:5454545b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.339 [2024-11-16 16:43:44.994150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.340 #57 NEW cov: 11811 ft: 14781 corp: 34/856b lim: 40 exec/s: 57 rss: 70Mb L: 21/40 MS: 1 ChangeByte- 00:07:59.340 [2024-11-16 16:43:45.034512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:5454ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.340 [2024-11-16 16:43:45.034538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.340 [2024-11-16 16:43:45.034595] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:00005454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.340 [2024-11-16 16:43:45.034609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.340 [2024-11-16 16:43:45.034641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a545400 cdw11:00000054 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.340 [2024-11-16 16:43:45.034655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.340 [2024-11-16 16:43:45.034713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.340 [2024-11-16 16:43:45.034727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.340 [2024-11-16 16:43:45.034783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.340 [2024-11-16 16:43:45.034796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.340 #58 NEW cov: 11811 ft: 14797 corp: 35/896b lim: 40 exec/s: 58 rss: 70Mb L: 40/40 MS: 1 CMP- DE: "\377\377\377\377\000\000\000\000"- 00:07:59.340 [2024-11-16 16:43:45.074149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.340 [2024-11-16 16:43:45.074179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.599 #59 NEW cov: 11811 ft: 14861 corp: 36/906b lim: 40 exec/s: 59 rss: 70Mb L: 10/40 MS: 1 EraseBytes- 00:07:59.599 [2024-11-16 16:43:45.114389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a545454 cdw11:540a5454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.599 [2024-11-16 16:43:45.114414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.599 [2024-11-16 16:43:45.114468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.599 [2024-11-16 16:43:45.114482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.599 #60 NEW cov: 11811 ft: 14889 corp: 37/928b lim: 40 exec/s: 60 rss: 70Mb L: 22/40 MS: 1 CrossOver- 00:07:59.599 [2024-11-16 16:43:45.154766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a54a5ab cdw11:abababab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.599 [2024-11-16 16:43:45.154792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.599 [2024-11-16 16:43:45.154848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a3ab5454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.599 
[2024-11-16 16:43:45.154862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.599 [2024-11-16 16:43:45.154917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:54c2c2c2 cdw11:c2c2c2c2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.599 [2024-11-16 16:43:45.154931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.599 [2024-11-16 16:43:45.154987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c2c2c2c2 cdw11:c2545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.599 [2024-11-16 16:43:45.155000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.599 #61 NEW cov: 11811 ft: 14894 corp: 38/961b lim: 40 exec/s: 61 rss: 70Mb L: 33/40 MS: 1 ChangeBinInt- 00:07:59.599 [2024-11-16 16:43:45.194978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.599 [2024-11-16 16:43:45.195004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.599 [2024-11-16 16:43:45.195061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.195074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.195131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a545400 cdw11:00000054 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.195144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.195199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54549a01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.195212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.195264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00005454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.195281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.600 #62 NEW cov: 11811 ft: 14908 corp: 39/1001b lim: 40 exec/s: 62 rss: 70Mb L: 40/40 MS: 1 CMP- DE: "\232\001\000\000"- 00:07:59.600 [2024-11-16 16:43:45.235094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.235119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.235177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 
cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.235191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.235248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a545400 cdw11:00000089 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.235261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.235317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.235330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.235386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:54545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.235399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.600 #63 NEW cov: 11811 ft: 14918 corp: 40/1041b lim: 40 exec/s: 63 rss: 70Mb L: 40/40 MS: 1 ChangeByte- 00:07:59.600 [2024-11-16 16:43:45.275080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.275106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.275161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.275175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.275232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:55555555 cdw11:55555454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.275247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.275302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:5454ff55 cdw11:55555555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.275315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.600 #64 NEW cov: 11811 ft: 14920 corp: 41/1080b lim: 40 exec/s: 64 rss: 70Mb L: 39/40 MS: 1 CopyPart- 00:07:59.600 [2024-11-16 16:43:45.315097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:9a010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.315122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.315183] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.315197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.600 [2024-11-16 16:43:45.315256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.600 [2024-11-16 16:43:45.315270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.600 #65 NEW cov: 11811 ft: 14942 corp: 42/1104b lim: 40 exec/s: 65 rss: 70Mb L: 24/40 MS: 1 PersAutoDict- DE: "\232\001\000\000"- 00:07:59.860 [2024-11-16 16:43:45.354954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a540a54 cdw11:545454a5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.860 [2024-11-16 16:43:45.354980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.860 #66 NEW cov: 11811 ft: 14964 corp: 43/1112b lim: 40 exec/s: 66 rss: 70Mb L: 8/40 MS: 1 CrossOver- 00:07:59.860 [2024-11-16 16:43:45.395212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000032 cdw11:00270000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.860 [2024-11-16 16:43:45.395237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.860 [2024-11-16 16:43:45.395295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.860 [2024-11-16 16:43:45.395309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.860 #67 NEW cov: 11811 ft: 14977 corp: 44/1134b lim: 40 exec/s: 67 rss: 70Mb L: 22/40 MS: 1 InsertByte- 00:07:59.860 [2024-11-16 16:43:45.435567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a545454 cdw11:54545454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.860 [2024-11-16 16:43:45.435593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.860 [2024-11-16 16:43:45.435648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:54545454 cdw11:54545555 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.860 [2024-11-16 16:43:45.435662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.860 [2024-11-16 16:43:45.435724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:55555555 cdw11:55555454 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.860 [2024-11-16 16:43:45.435738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.860 [2024-11-16 16:43:45.435797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:5454ff55 cdw11:9a010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.860 [2024-11-16 
16:43:45.435810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.860 #68 NEW cov: 11811 ft: 14986 corp: 45/1173b lim: 40 exec/s: 34 rss: 70Mb L: 39/40 MS: 1 PersAutoDict- DE: "\232\001\000\000"- 00:07:59.860 #68 DONE cov: 11811 ft: 14986 corp: 45/1173b lim: 40 exec/s: 34 rss: 70Mb 00:07:59.860 ###### Recommended dictionary. ###### 00:07:59.860 "\000\000\000\000" # Uses: 1 00:07:59.860 "\377\377\377\377\000\000\000\000" # Uses: 0 00:07:59.860 "\232\001\000\000" # Uses: 2 00:07:59.860 ###### End of recommended dictionary. ###### 00:07:59.860 Done 68 runs in 2 second(s) 00:07:59.860 16:43:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:59.860 16:43:45 -- ../common.sh@72 -- # (( i++ )) 00:07:59.860 16:43:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.860 16:43:45 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:59.860 16:43:45 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:59.860 16:43:45 -- nvmf/run.sh@24 -- # local timen=1 00:07:59.860 16:43:45 -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.860 16:43:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:59.860 16:43:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:59.860 16:43:45 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:59.860 16:43:45 -- nvmf/run.sh@29 -- # port=4411 00:07:59.860 16:43:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:59.860 16:43:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:59.860 16:43:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.860 16:43:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:08:00.120 [2024-11-16 16:43:45.618378] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:00.120 [2024-11-16 16:43:45.618443] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid487978 ] 00:08:00.120 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.120 [2024-11-16 16:43:45.791513] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.120 [2024-11-16 16:43:45.811118] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:00.120 [2024-11-16 16:43:45.811234] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.120 [2024-11-16 16:43:45.862463] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.380 [2024-11-16 16:43:45.878801] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:00.380 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:00.380 INFO: Seed: 3799446225
00:08:00.380 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:00.380 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:00.380 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11
00:08:00.380 INFO: A corpus is not provided, starting from an empty corpus
00:08:00.380 #2 INITED exec/s: 0 rss: 59Mb
00:08:00.380 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:00.380 This may also happen if the target rejected all inputs we tried so far
00:08:00.380 [2024-11-16 16:43:45.924409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:00.380 [2024-11-16 16:43:45.924438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.380 [2024-11-16 16:43:45.924500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:00.380 [2024-11-16 16:43:45.924515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.380 [2024-11-16 16:43:45.924577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:00.380 [2024-11-16 16:43:45.924591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.640 NEW_FUNC[1/671]: 0x45ffb8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223
00:08:00.640 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:00.640 #4 NEW cov: 11596 ft: 11597 corp: 2/30b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 2 CopyPart-InsertRepeatedBytes-
00:08:00.640 [2024-11-16 16:43:46.225004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:00.640 [2024-11-16 16:43:46.225035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:00.640 [2024-11-16 16:43:46.225096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:050505ff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:00.640 [2024-11-16 16:43:46.225110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:00.640 [2024-11-16 16:43:46.225164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:01000005 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:00.640 [2024-11-16 16:43:46.225178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:08:00.640 #5 NEW cov: 11709 ft: 12058 corp: 3/59b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 CMP- DE: "\377\001\000\000"-
00:08:00.640 [2024-11-16 16:43:46.275133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY
SEND (81) qid:0 cid:4 nsid:0 cdw10:0aff0100 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.640 [2024-11-16 16:43:46.275160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.640 [2024-11-16 16:43:46.275221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.640 [2024-11-16 16:43:46.275235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.640 [2024-11-16 16:43:46.275295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.640 [2024-11-16 16:43:46.275308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.640 #8 NEW cov: 11715 ft: 12344 corp: 4/87b lim: 40 exec/s: 0 rss: 67Mb L: 28/29 MS: 3 CopyPart-PersAutoDict-InsertRepeatedBytes- DE: "\377\001\000\000"- 00:08:00.640 [2024-11-16 16:43:46.315211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aff0100 cdw11:484803de SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.640 [2024-11-16 16:43:46.315238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.640 [2024-11-16 16:43:46.315296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:563f2217 cdw11:8a004848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.640 [2024-11-16 16:43:46.315310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.640 [2024-11-16 16:43:46.315368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.640 [2024-11-16 16:43:46.315382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.640 #9 NEW cov: 11800 ft: 12599 corp: 5/115b lim: 40 exec/s: 0 rss: 67Mb L: 28/29 MS: 1 CMP- DE: "\003\336V?\"\027\212\000"- 00:08:00.640 [2024-11-16 16:43:46.355344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aff0100 cdw11:484803de SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.640 [2024-11-16 16:43:46.355370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.640 [2024-11-16 16:43:46.355431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:563fde56 cdw11:3f22178a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.640 [2024-11-16 16:43:46.355444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.640 [2024-11-16 16:43:46.355499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.640 [2024-11-16 16:43:46.355512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.640 #10 NEW cov: 
11800 ft: 12656 corp: 6/143b lim: 40 exec/s: 0 rss: 67Mb L: 28/29 MS: 1 CopyPart- 00:08:00.900 [2024-11-16 16:43:46.395326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.395352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.900 [2024-11-16 16:43:46.395407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.395421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.900 #11 NEW cov: 11800 ft: 13114 corp: 7/162b lim: 40 exec/s: 0 rss: 67Mb L: 19/29 MS: 1 EraseBytes- 00:08:00.900 [2024-11-16 16:43:46.435704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.435729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.900 [2024-11-16 16:43:46.435786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:050505ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.435800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.900 [2024-11-16 16:43:46.435845] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:01000005 cdw11:05050503 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.435858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.900 [2024-11-16 16:43:46.435914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:de563f22 cdw11:178a0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.435928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.900 #12 NEW cov: 11800 ft: 13493 corp: 8/199b lim: 40 exec/s: 0 rss: 67Mb L: 37/37 MS: 1 PersAutoDict- DE: "\003\336V?\"\027\212\000"- 00:08:00.900 [2024-11-16 16:43:46.475544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.475570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.900 [2024-11-16 16:43:46.475627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.475641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.900 #13 NEW cov: 11800 ft: 13513 corp: 9/219b lim: 40 exec/s: 0 rss: 67Mb L: 20/37 MS: 1 EraseBytes- 00:08:00.900 [2024-11-16 16:43:46.515644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 
cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.515674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.900 [2024-11-16 16:43:46.515735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:0505050a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.515749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.900 #14 NEW cov: 11800 ft: 13546 corp: 10/240b lim: 40 exec/s: 0 rss: 68Mb L: 21/37 MS: 1 CrossOver- 00:08:00.900 [2024-11-16 16:43:46.555970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.900 [2024-11-16 16:43:46.555996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.900 [2024-11-16 16:43:46.556053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.901 [2024-11-16 16:43:46.556067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.901 [2024-11-16 16:43:46.556123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.901 [2024-11-16 16:43:46.556136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.901 #15 NEW cov: 11800 ft: 13565 corp: 11/269b lim: 40 exec/s: 0 rss: 68Mb L: 29/37 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:00.901 [2024-11-16 16:43:46.596063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.901 [2024-11-16 16:43:46.596089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.901 [2024-11-16 16:43:46.596146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.901 [2024-11-16 16:43:46.596159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.901 [2024-11-16 16:43:46.596217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:050505ff cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.901 [2024-11-16 16:43:46.596230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.901 #16 NEW cov: 11800 ft: 13585 corp: 12/294b lim: 40 exec/s: 0 rss: 68Mb L: 25/37 MS: 1 CrossOver- 00:08:00.901 [2024-11-16 16:43:46.636171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.901 [2024-11-16 16:43:46.636197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:00.901 [2024-11-16 16:43:46.636255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.901 [2024-11-16 16:43:46.636269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.901 [2024-11-16 16:43:46.636326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.901 [2024-11-16 16:43:46.636339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.160 #17 NEW cov: 11800 ft: 13676 corp: 13/323b lim: 40 exec/s: 0 rss: 68Mb L: 29/37 MS: 1 ShuffleBytes- 00:08:01.160 [2024-11-16 16:43:46.676086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.676115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.160 [2024-11-16 16:43:46.676173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:055d0505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.676187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.160 #18 NEW cov: 11800 ft: 13705 corp: 14/342b lim: 40 exec/s: 0 rss: 68Mb L: 19/37 MS: 1 ChangeByte- 00:08:01.160 [2024-11-16 16:43:46.716259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.716285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.160 [2024-11-16 16:43:46.716344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.716358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.160 #19 NEW cov: 11800 ft: 13749 corp: 15/364b lim: 40 exec/s: 0 rss: 68Mb L: 22/37 MS: 1 EraseBytes- 00:08:01.160 [2024-11-16 16:43:46.756177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.756203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.160 #20 NEW cov: 11800 ft: 14484 corp: 16/379b lim: 40 exec/s: 0 rss: 68Mb L: 15/37 MS: 1 EraseBytes- 00:08:01.160 [2024-11-16 16:43:46.806493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.806519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.160 [2024-11-16 16:43:46.806574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:055d0505 cdw11:4a050a0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.806587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.160 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.160 #21 NEW cov: 11823 ft: 14565 corp: 17/395b lim: 40 exec/s: 0 rss: 68Mb L: 16/37 MS: 1 InsertByte- 00:08:01.160 [2024-11-16 16:43:46.846938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.846964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.160 [2024-11-16 16:43:46.847019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0505ffff cdw11:ffffff05 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.847033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.160 [2024-11-16 16:43:46.847088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.847103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.160 [2024-11-16 16:43:46.847157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.160 [2024-11-16 16:43:46.847170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.160 #22 NEW cov: 11823 ft: 14604 corp: 18/429b lim: 40 exec/s: 0 rss: 68Mb L: 34/37 MS: 1 InsertRepeatedBytes- 00:08:01.160 [2024-11-16 16:43:46.886825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aff0100 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.161 [2024-11-16 16:43:46.886852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.161 [2024-11-16 16:43:46.886910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.161 [2024-11-16 16:43:46.886924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.161 [2024-11-16 16:43:46.886981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4848ffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.161 [2024-11-16 16:43:46.886994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.161 #23 NEW cov: 11823 ft: 14630 corp: 19/457b lim: 40 exec/s: 0 rss: 68Mb L: 28/37 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:01.420 [2024-11-16 16:43:46.926831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0505ff01 cdw11:00000505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 
[2024-11-16 16:43:46.926856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:46.926913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:055d0505 cdw11:4a050a0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:46.926926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.420 #24 NEW cov: 11823 ft: 14710 corp: 20/473b lim: 40 exec/s: 24 rss: 68Mb L: 16/37 MS: 1 PersAutoDict- DE: "\377\001\000\000"- 00:08:01.420 [2024-11-16 16:43:46.967271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:46.967296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:46.967352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:46.967365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:46.967421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:050505ff cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:46.967435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:46.967488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:05050505 cdw11:0505050a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:46.967500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.420 #25 NEW cov: 11823 ft: 14720 corp: 21/506b lim: 40 exec/s: 25 rss: 68Mb L: 33/37 MS: 1 CopyPart- 00:08:01.420 [2024-11-16 16:43:47.007082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aff0100 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:47.007107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:47.007163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:47.007180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.420 #26 NEW cov: 11823 ft: 14744 corp: 22/525b lim: 40 exec/s: 26 rss: 68Mb L: 19/37 MS: 1 EraseBytes- 00:08:01.420 [2024-11-16 16:43:47.047352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:47.047378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:47.047438] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:050d0505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:47.047452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:47.047507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:050505ff cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:47.047521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.420 #27 NEW cov: 11823 ft: 14758 corp: 23/550b lim: 40 exec/s: 27 rss: 68Mb L: 25/37 MS: 1 ChangeBit- 00:08:01.420 [2024-11-16 16:43:47.087598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aff0100 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:47.087624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:47.087683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:47.087697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:47.087753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:48ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:47.087768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.420 [2024-11-16 16:43:47.087819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffff48ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.420 [2024-11-16 16:43:47.087832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.421 #28 NEW cov: 11823 ft: 14776 corp: 24/587b lim: 40 exec/s: 28 rss: 68Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:01.421 [2024-11-16 16:43:47.127390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.421 [2024-11-16 16:43:47.127416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.421 [2024-11-16 16:43:47.127474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:054b0505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.421 [2024-11-16 16:43:47.127488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.421 #29 NEW cov: 11823 ft: 14793 corp: 25/609b lim: 40 exec/s: 29 rss: 69Mb L: 22/37 MS: 1 InsertByte- 00:08:01.421 [2024-11-16 16:43:47.167505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.421 [2024-11-16 16:43:47.167531] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.421 [2024-11-16 16:43:47.167592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.421 [2024-11-16 16:43:47.167607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.681 #30 NEW cov: 11823 ft: 14826 corp: 26/629b lim: 40 exec/s: 30 rss: 69Mb L: 20/37 MS: 1 ChangeByte- 00:08:01.681 [2024-11-16 16:43:47.207965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.207991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.208049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:050505e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.208063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.208117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:e7e7e7e7 cdw11:e7e7e7e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.208131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.208188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:e7e70505 cdw11:0505050a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.208202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.681 #31 NEW cov: 11823 ft: 14841 corp: 27/662b lim: 40 exec/s: 31 rss: 69Mb L: 33/37 MS: 1 InsertRepeatedBytes- 00:08:01.681 [2024-11-16 16:43:47.247757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.247783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.247839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.247853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.681 #32 NEW cov: 11823 ft: 14847 corp: 28/682b lim: 40 exec/s: 32 rss: 69Mb L: 20/37 MS: 1 EraseBytes- 00:08:01.681 [2024-11-16 16:43:47.288160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.288186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.288244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 
cdw11:05000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.288257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.288313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.288326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.288381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:0505050a cdw11:0a050500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.288394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.681 #33 NEW cov: 11823 ft: 14923 corp: 29/721b lim: 40 exec/s: 33 rss: 69Mb L: 39/39 MS: 1 CopyPart- 00:08:01.681 [2024-11-16 16:43:47.327988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000505 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.328015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.328071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.328085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.681 #37 NEW cov: 11823 ft: 14928 corp: 30/744b lim: 40 exec/s: 37 rss: 69Mb L: 23/39 MS: 4 PersAutoDict-ChangeBit-CrossOver-InsertRepeatedBytes- DE: "\000\000\000\000"- 00:08:01.681 [2024-11-16 16:43:47.368250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.368277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.368333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:050d9e05 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.368347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.368403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:050505ff cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.368417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.681 #38 NEW cov: 11823 ft: 14941 corp: 31/769b lim: 40 exec/s: 38 rss: 69Mb L: 25/39 MS: 1 ChangeByte- 00:08:01.681 [2024-11-16 16:43:47.408380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.408407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.408465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.408480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.681 [2024-11-16 16:43:47.408537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.681 [2024-11-16 16:43:47.408551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.941 #39 NEW cov: 11823 ft: 14943 corp: 32/797b lim: 40 exec/s: 39 rss: 69Mb L: 28/39 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:01.941 [2024-11-16 16:43:47.448535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.448562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.941 [2024-11-16 16:43:47.448622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.448636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.941 [2024-11-16 16:43:47.448696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:050585ff cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.448714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.941 #40 NEW cov: 11823 ft: 14990 corp: 33/822b lim: 40 exec/s: 40 rss: 69Mb L: 25/39 MS: 1 ChangeBit- 00:08:01.941 [2024-11-16 16:43:47.488375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.488400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.941 [2024-11-16 16:43:47.488458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050549 cdw11:0505050a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.488472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.941 #41 NEW cov: 11823 ft: 15035 corp: 34/843b lim: 40 exec/s: 41 rss: 69Mb L: 21/39 MS: 1 ChangeByte- 00:08:01.941 [2024-11-16 16:43:47.528331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050a05 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.528357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.941 #42 NEW cov: 11823 ft: 15075 corp: 35/855b lim: 40 exec/s: 42 rss: 69Mb L: 12/39 MS: 1 EraseBytes- 00:08:01.941 [2024-11-16 16:43:47.568983] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aff0100 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.569009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.941 [2024-11-16 16:43:47.569067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:48484848 cdw11:48484848 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.569081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.941 [2024-11-16 16:43:47.569134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:48ffffff cdw11:ffff41ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.569148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.941 [2024-11-16 16:43:47.569202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffff48 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.569215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.941 #43 NEW cov: 11823 ft: 15090 corp: 36/893b lim: 40 exec/s: 43 rss: 69Mb L: 38/39 MS: 1 InsertByte- 00:08:01.941 [2024-11-16 16:43:47.609037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.609063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.941 [2024-11-16 16:43:47.609119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.609133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.941 [2024-11-16 16:43:47.609190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:050505ff cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.609204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.941 [2024-11-16 16:43:47.609263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:05050505 cdw11:ff010000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.609277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.941 #44 NEW cov: 11823 ft: 15095 corp: 37/926b lim: 40 exec/s: 44 rss: 69Mb L: 33/39 MS: 1 PersAutoDict- DE: "\377\001\000\000"- 00:08:01.941 [2024-11-16 16:43:47.648982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.649008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.941 
[2024-11-16 16:43:47.649065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:050505f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.941 [2024-11-16 16:43:47.649079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.942 [2024-11-16 16:43:47.649139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:f1f1f1f1 cdw11:f1f1f1f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.942 [2024-11-16 16:43:47.649152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.942 #45 NEW cov: 11823 ft: 15098 corp: 38/957b lim: 40 exec/s: 45 rss: 69Mb L: 31/39 MS: 1 InsertRepeatedBytes- 00:08:01.942 [2024-11-16 16:43:47.689161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.942 [2024-11-16 16:43:47.689188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.942 [2024-11-16 16:43:47.689250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff0d9e05 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.942 [2024-11-16 16:43:47.689265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.942 [2024-11-16 16:43:47.689321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:050505ff cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.942 [2024-11-16 16:43:47.689335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.201 #46 NEW cov: 11823 ft: 15117 corp: 39/982b lim: 40 exec/s: 46 rss: 70Mb L: 25/39 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:02.201 [2024-11-16 16:43:47.729223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.729249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.201 [2024-11-16 16:43:47.729308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:0505050d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.729321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.201 [2024-11-16 16:43:47.729378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:050505ff cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.729392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.201 #47 NEW cov: 11823 ft: 15122 corp: 40/1007b lim: 40 exec/s: 47 rss: 70Mb L: 25/39 MS: 1 ShuffleBytes- 00:08:02.201 [2024-11-16 16:43:47.769196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:02.201 [2024-11-16 16:43:47.769225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.201 [2024-11-16 16:43:47.769284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:0505050a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.769298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.201 #48 NEW cov: 11823 ft: 15163 corp: 41/1024b lim: 40 exec/s: 48 rss: 70Mb L: 17/39 MS: 1 EraseBytes- 00:08:02.201 [2024-11-16 16:43:47.809330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.809356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.201 [2024-11-16 16:43:47.809412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050507 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.809425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.201 #49 NEW cov: 11823 ft: 15205 corp: 42/1043b lim: 40 exec/s: 49 rss: 70Mb L: 19/39 MS: 1 ChangeBit- 00:08:02.201 [2024-11-16 16:43:47.849416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.849441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.201 [2024-11-16 16:43:47.849497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:050505fd cdw11:fafafafa SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.849511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.201 #50 NEW cov: 11823 ft: 15215 corp: 43/1062b lim: 40 exec/s: 50 rss: 70Mb L: 19/39 MS: 1 ChangeBinInt- 00:08:02.201 [2024-11-16 16:43:47.879675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.879701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.201 [2024-11-16 16:43:47.879758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.879772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.201 [2024-11-16 16:43:47.879827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fe0505ff cdw11:01000005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.201 [2024-11-16 16:43:47.879841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.201 #51 NEW cov: 11823 ft: 15228 corp: 44/1087b lim: 40 exec/s: 51 rss: 
70Mb L: 25/39 MS: 1 ChangeBinInt-
00:08:02.201 [2024-11-16 16:43:47.919535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:05050505 cdw11:05050505 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.201 [2024-11-16 16:43:47.919560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:02.202 #52 NEW cov: 11823 ft: 15300 corp: 45/1102b lim: 40 exec/s: 26 rss: 70Mb L: 15/39 MS: 1 CrossOver-
00:08:02.202 #52 DONE cov: 11823 ft: 15300 corp: 45/1102b lim: 40 exec/s: 26 rss: 70Mb
00:08:02.202 ###### Recommended dictionary. ######
00:08:02.202 "\377\001\000\000" # Uses: 3
00:08:02.202 "\003\336V?\"\027\212\000" # Uses: 1
00:08:02.202 "\000\000\000\000" # Uses: 1
00:08:02.202 "\377\377\377\377\377\377\377\377" # Uses: 1
00:08:02.202 ###### End of recommended dictionary. ######
00:08:02.202 Done 52 runs in 2 second(s)
00:08:02.461 16:43:48 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf
00:08:02.461 16:43:48 -- ../common.sh@72 -- # (( i++ ))
00:08:02.461 16:43:48 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:02.461 16:43:48 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1
00:08:02.461 16:43:48 -- nvmf/run.sh@23 -- # local fuzzer_type=12
00:08:02.461 16:43:48 -- nvmf/run.sh@24 -- # local timen=1
00:08:02.461 16:43:48 -- nvmf/run.sh@25 -- # local core=0x1
00:08:02.461 16:43:48 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:08:02.461 16:43:48 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf
00:08:02.461 16:43:48 -- nvmf/run.sh@29 -- # printf %02d 12
00:08:02.461 16:43:48 -- nvmf/run.sh@29 -- # port=4412
00:08:02.461 16:43:48 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:08:02.461 16:43:48 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412'
00:08:02.461 16:43:48 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:02.461 16:43:48 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock
00:08:02.461 [2024-11-16 16:43:48.102307] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
00:08:02.461 [2024-11-16 16:43:48.102374] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid488515 ]
00:08:02.721 EAL: No free 2048 kB hugepages reported on node 1
00:08:02.721 [2024-11-16 16:43:48.276183] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:02.721 [2024-11-16 16:43:48.295419] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:02.721 [2024-11-16 16:43:48.295534] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:02.721 [2024-11-16 16:43:48.346798] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:02.721 [2024-11-16 16:43:48.363158] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 ***
00:08:02.721 INFO: Running with entropic power schedule (0xFF, 100).
00:08:02.721 INFO: Seed: 1989478969
00:08:02.721 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:02.721 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:02.721 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12
00:08:02.721 INFO: A corpus is not provided, starting from an empty corpus
00:08:02.721 #2 INITED exec/s: 0 rss: 59Mb
00:08:02.721 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:02.721 This may also happen if the target rejected all inputs we tried so far
00:08:02.721 [2024-11-16 16:43:48.418444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.721 [2024-11-16 16:43:48.418473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:02.980 NEW_FUNC[1/671]: 0x461d28 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241
00:08:02.980 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:08:02.980 #3 NEW cov: 11594 ft: 11595 corp: 2/14b lim: 40 exec/s: 0 rss: 67Mb L: 13/13 MS: 1 InsertRepeatedBytes-
00:08:02.980 [2024-11-16 16:43:48.709215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.980 [2024-11-16 16:43:48.709249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:08:02.980 [2024-11-16 16:43:48.709303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:98989898 cdw11:f2f2f20a SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:02.980 [2024-11-16 16:43:48.709316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:08:03.240 #4 NEW cov: 11707 ft: 12674 corp: 3/30b lim: 40 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 InsertRepeatedBytes-
00:08:03.240 [2024-11-16 16:43:48.759065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:08:03.240
[2024-11-16 16:43:48.759092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.240 #5 NEW cov: 11713 ft: 12872 corp: 4/43b lim: 40 exec/s: 0 rss: 67Mb L: 13/16 MS: 1 ChangeByte- 00:08:03.240 [2024-11-16 16:43:48.799172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.240 [2024-11-16 16:43:48.799198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.240 #6 NEW cov: 11798 ft: 13252 corp: 5/56b lim: 40 exec/s: 0 rss: 67Mb L: 13/16 MS: 1 ShuffleBytes- 00:08:03.240 [2024-11-16 16:43:48.839293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:0d989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.240 [2024-11-16 16:43:48.839320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.240 #7 NEW cov: 11798 ft: 13344 corp: 6/69b lim: 40 exec/s: 0 rss: 67Mb L: 13/16 MS: 1 ChangeBinInt- 00:08:03.240 [2024-11-16 16:43:48.880032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:febebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.240 [2024-11-16 16:43:48.880058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.240 [2024-11-16 16:43:48.880113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.240 [2024-11-16 16:43:48.880126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.240 [2024-11-16 16:43:48.880179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.240 [2024-11-16 16:43:48.880192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.240 [2024-11-16 16:43:48.880244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.240 [2024-11-16 16:43:48.880258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.240 [2024-11-16 16:43:48.880309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:bebebebe cdw11:bebebe11 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.240 [2024-11-16 16:43:48.880322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.240 #10 NEW cov: 11798 ft: 13797 corp: 7/109b lim: 40 exec/s: 0 rss: 67Mb L: 40/40 MS: 3 InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:08:03.240 [2024-11-16 16:43:48.919522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:9898987a cdw11:9898980a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.240 [2024-11-16 16:43:48.919549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.240 #12 NEW cov: 11798 ft: 13844 corp: 8/117b lim: 40 exec/s: 0 rss: 67Mb L: 8/40 MS: 2 EraseBytes-InsertByte- 00:08:03.240 [2024-11-16 16:43:48.949623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:9898987a cdw11:989898be SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.240 [2024-11-16 16:43:48.949649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.240 #13 NEW cov: 11798 ft: 13915 corp: 9/125b lim: 40 exec/s: 0 rss: 67Mb L: 8/40 MS: 1 CrossOver- 00:08:03.500 [2024-11-16 16:43:48.989918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98919898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:48.989945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.500 [2024-11-16 16:43:48.990005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:989a9898 cdw11:8a989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:48.990019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.500 #18 NEW cov: 11798 ft: 13981 corp: 10/141b lim: 40 exec/s: 0 rss: 67Mb L: 16/40 MS: 5 EraseBytes-ChangeByte-ChangeBit-ChangeBit-CrossOver- 00:08:03.500 [2024-11-16 16:43:49.029841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.029867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.500 #19 NEW cov: 11798 ft: 14002 corp: 11/154b lim: 40 exec/s: 0 rss: 67Mb L: 13/40 MS: 1 ChangeBit- 00:08:03.500 [2024-11-16 16:43:49.070006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:0d98982e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.070032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.500 #20 NEW cov: 11798 ft: 14045 corp: 12/168b lim: 40 exec/s: 0 rss: 67Mb L: 14/40 MS: 1 InsertByte- 00:08:03.500 [2024-11-16 16:43:49.110098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.110123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.500 #22 NEW cov: 11798 ft: 14068 corp: 13/180b lim: 40 exec/s: 0 rss: 67Mb L: 12/40 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:03.500 [2024-11-16 16:43:49.150222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.150248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.500 #23 NEW cov: 11798 ft: 14112 corp: 14/194b lim: 40 exec/s: 0 rss: 68Mb L: 14/40 MS: 1 InsertByte- 00:08:03.500 [2024-11-16 16:43:49.190910] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:febebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.190936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.500 [2024-11-16 16:43:49.190988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.191002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.500 [2024-11-16 16:43:49.191056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.191073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.500 [2024-11-16 16:43:49.191125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.191137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.500 [2024-11-16 16:43:49.191191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:bebebebe cdw11:bebebe11 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.191204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.500 #24 NEW cov: 11798 ft: 14126 corp: 15/234b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 ShuffleBytes- 00:08:03.500 [2024-11-16 16:43:49.230746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.230772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.500 [2024-11-16 16:43:49.230829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.230843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.500 [2024-11-16 16:43:49.230897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.500 [2024-11-16 16:43:49.230911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.759 #27 NEW cov: 11798 ft: 14322 corp: 16/263b lim: 40 exec/s: 0 rss: 68Mb L: 29/40 MS: 3 EraseBytes-EraseBytes-InsertRepeatedBytes- 00:08:03.759 [2024-11-16 16:43:49.270996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 16:43:49.271022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:03.759 [2024-11-16 16:43:49.271076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:000d980a cdw11:21212121 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 16:43:49.271090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.759 [2024-11-16 16:43:49.271143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:21212121 cdw11:21212121 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 16:43:49.271156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.759 [2024-11-16 16:43:49.271207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:21212121 cdw11:21212121 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 16:43:49.271220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.759 #28 NEW cov: 11798 ft: 14333 corp: 17/295b lim: 40 exec/s: 0 rss: 68Mb L: 32/40 MS: 1 InsertRepeatedBytes- 00:08:03.759 [2024-11-16 16:43:49.310682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:9898d898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 16:43:49.310709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.759 #29 NEW cov: 11798 ft: 14347 corp: 18/308b lim: 40 exec/s: 0 rss: 68Mb L: 13/40 MS: 1 ShuffleBytes- 00:08:03.759 [2024-11-16 16:43:49.350934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 16:43:49.350960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.759 [2024-11-16 16:43:49.351015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:980a9898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 16:43:49.351029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.759 #30 NEW cov: 11798 ft: 14370 corp: 19/330b lim: 40 exec/s: 0 rss: 68Mb L: 22/40 MS: 1 CopyPart- 00:08:03.759 [2024-11-16 16:43:49.391217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f262626 cdw11:26262626 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 16:43:49.391244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.759 [2024-11-16 16:43:49.391300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:26262626 cdw11:26262626 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 16:43:49.391314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.759 [2024-11-16 16:43:49.391368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:26262626 cdw11:26262626 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.759 [2024-11-16 
16:43:49.391381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.760 #34 NEW cov: 11798 ft: 14392 corp: 20/361b lim: 40 exec/s: 34 rss: 68Mb L: 31/40 MS: 4 ChangeByte-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:03.760 [2024-11-16 16:43:49.431016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6e6e6e cdw11:6e6e6e6e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.760 [2024-11-16 16:43:49.431042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.760 #37 NEW cov: 11798 ft: 14493 corp: 21/374b lim: 40 exec/s: 37 rss: 68Mb L: 13/40 MS: 3 ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:08:03.760 [2024-11-16 16:43:49.461236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98000d98 cdw11:0a989800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.760 [2024-11-16 16:43:49.461262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.760 [2024-11-16 16:43:49.461315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000d980a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.760 [2024-11-16 16:43:49.461328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.760 #38 NEW cov: 11798 ft: 14520 corp: 22/390b lim: 40 exec/s: 38 rss: 68Mb L: 16/40 MS: 1 CopyPart- 00:08:03.760 [2024-11-16 16:43:49.501215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:0d98982e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.760 [2024-11-16 16:43:49.501241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.019 #39 NEW cov: 11798 ft: 14533 corp: 23/404b lim: 40 exec/s: 39 rss: 68Mb L: 14/40 MS: 1 CrossOver- 00:08:04.019 [2024-11-16 16:43:49.541647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f262626 cdw11:26262624 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.019 [2024-11-16 16:43:49.541677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.019 [2024-11-16 16:43:49.541733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:26262626 cdw11:26262626 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.019 [2024-11-16 16:43:49.541750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.019 [2024-11-16 16:43:49.541804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:26262626 cdw11:26262626 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.019 [2024-11-16 16:43:49.541818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.019 #40 NEW cov: 11798 ft: 14536 corp: 24/435b lim: 40 exec/s: 40 rss: 68Mb L: 31/40 MS: 1 ChangeBit- 00:08:04.019 [2024-11-16 16:43:49.581775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:9898000d SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:04.019 [2024-11-16 16:43:49.581801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.019 [2024-11-16 16:43:49.581857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:98989898 cdw11:9898980a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.019 [2024-11-16 16:43:49.581871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.019 [2024-11-16 16:43:49.581925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0d989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.019 [2024-11-16 16:43:49.581938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.020 #41 NEW cov: 11798 ft: 14544 corp: 25/460b lim: 40 exec/s: 41 rss: 68Mb L: 25/40 MS: 1 CopyPart- 00:08:04.020 [2024-11-16 16:43:49.621603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6e6e6e cdw11:6e6e6e6e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.621629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.020 #42 NEW cov: 11798 ft: 14555 corp: 26/473b lim: 40 exec/s: 42 rss: 68Mb L: 13/40 MS: 1 ChangeByte- 00:08:04.020 [2024-11-16 16:43:49.661862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:febebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.661888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.020 [2024-11-16 16:43:49.661948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.661962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.020 #43 NEW cov: 11798 ft: 14561 corp: 27/494b lim: 40 exec/s: 43 rss: 68Mb L: 21/40 MS: 1 EraseBytes- 00:08:04.020 [2024-11-16 16:43:49.702279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.702305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.020 [2024-11-16 16:43:49.702361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:000d980a cdw11:21212121 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.702374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.020 [2024-11-16 16:43:49.702426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:21212121 cdw11:21212121 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.702440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.020 [2024-11-16 16:43:49.702494] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:21212121 cdw11:21212121 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.702507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.020 #44 NEW cov: 11798 ft: 14574 corp: 28/526b lim: 40 exec/s: 44 rss: 68Mb L: 32/40 MS: 1 ShuffleBytes- 00:08:04.020 [2024-11-16 16:43:49.742555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:febebebe cdw11:bebebe00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.742580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.020 [2024-11-16 16:43:49.742638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000028be SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.742652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.020 [2024-11-16 16:43:49.742709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.742722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.020 [2024-11-16 16:43:49.742775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.742788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.020 [2024-11-16 16:43:49.742839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:bebebebe cdw11:bebebe11 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.020 [2024-11-16 16:43:49.742853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.020 #45 NEW cov: 11798 ft: 14581 corp: 29/566b lim: 40 exec/s: 45 rss: 69Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:04.280 [2024-11-16 16:43:49.782032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.782058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.280 #46 NEW cov: 11798 ft: 14594 corp: 30/579b lim: 40 exec/s: 46 rss: 69Mb L: 13/40 MS: 1 ShuffleBytes- 00:08:04.280 [2024-11-16 16:43:49.812101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:246a9898 cdw11:246a9898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.812126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.280 #51 NEW cov: 11798 ft: 14606 corp: 31/589b lim: 40 exec/s: 51 rss: 69Mb L: 10/40 MS: 5 ChangeBit-InsertByte-ChangeBit-CrossOver-CopyPart- 00:08:04.280 [2024-11-16 16:43:49.842214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 
cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.842239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.280 #52 NEW cov: 11798 ft: 14622 corp: 32/603b lim: 40 exec/s: 52 rss: 69Mb L: 14/40 MS: 1 CopyPart- 00:08:04.280 [2024-11-16 16:43:49.882913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:febebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.882939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.280 [2024-11-16 16:43:49.882998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:bebebebe cdw11:bebe9ebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.883012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.280 [2024-11-16 16:43:49.883067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.883081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.280 [2024-11-16 16:43:49.883136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:bebebebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.883149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.280 [2024-11-16 16:43:49.883203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:bebebebe cdw11:bebebe11 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.883217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.280 #53 NEW cov: 11798 ft: 14668 corp: 33/643b lim: 40 exec/s: 53 rss: 69Mb L: 40/40 MS: 1 ChangeBit- 00:08:04.280 [2024-11-16 16:43:49.922797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:9898000d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.922823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.280 [2024-11-16 16:43:49.922876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:98989898 cdw11:9898980a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.280 [2024-11-16 16:43:49.922889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.281 [2024-11-16 16:43:49.922944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0d989800 cdw11:00989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.281 [2024-11-16 16:43:49.922958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.281 #54 NEW cov: 11798 ft: 14678 corp: 34/668b lim: 40 exec/s: 54 rss: 69Mb L: 25/40 MS: 1 CrossOver- 
00:08:04.281 [2024-11-16 16:43:49.962567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98682f98 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.281 [2024-11-16 16:43:49.962593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.281 #55 NEW cov: 11798 ft: 14802 corp: 35/681b lim: 40 exec/s: 55 rss: 69Mb L: 13/40 MS: 1 ChangeBinInt- 00:08:04.281 [2024-11-16 16:43:50.002674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.281 [2024-11-16 16:43:50.002700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.540 #56 NEW cov: 11798 ft: 14805 corp: 36/694b lim: 40 exec/s: 56 rss: 69Mb L: 13/40 MS: 1 CrossOver- 00:08:04.541 [2024-11-16 16:43:50.042839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.042865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.541 #57 NEW cov: 11798 ft: 14843 corp: 37/708b lim: 40 exec/s: 57 rss: 69Mb L: 14/40 MS: 1 ChangeByte- 00:08:04.541 [2024-11-16 16:43:50.082964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.082996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.541 #58 NEW cov: 11798 ft: 14848 corp: 38/722b lim: 40 exec/s: 58 rss: 69Mb L: 14/40 MS: 1 CrossOver- 00:08:04.541 [2024-11-16 16:43:50.123213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98000d98 cdw11:0a989800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.123239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.541 [2024-11-16 16:43:50.123292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000d980a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.123306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.541 #59 NEW cov: 11798 ft: 14849 corp: 39/738b lim: 40 exec/s: 59 rss: 69Mb L: 16/40 MS: 1 ShuffleBytes- 00:08:04.541 [2024-11-16 16:43:50.163622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:98000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.163648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.541 [2024-11-16 16:43:50.163706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.163720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.541 
[2024-11-16 16:43:50.163771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.163786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.541 [2024-11-16 16:43:50.163838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000098 cdw11:98989898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.163851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.541 #60 NEW cov: 11798 ft: 14852 corp: 40/773b lim: 40 exec/s: 60 rss: 69Mb L: 35/40 MS: 1 InsertRepeatedBytes- 00:08:04.541 [2024-11-16 16:43:50.203306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989898 cdw11:9898d898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.203332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.541 #61 NEW cov: 11798 ft: 14955 corp: 41/786b lim: 40 exec/s: 61 rss: 69Mb L: 13/40 MS: 1 CopyPart- 00:08:04.541 [2024-11-16 16:43:50.243741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2f262626 cdw11:26262624 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.243768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.541 [2024-11-16 16:43:50.243823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:26262626 cdw11:26262626 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.243837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.541 [2024-11-16 16:43:50.243889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:26262626 cdw11:26062626 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.243903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.541 #62 NEW cov: 11798 ft: 14974 corp: 42/817b lim: 40 exec/s: 62 rss: 69Mb L: 31/40 MS: 1 ChangeBit- 00:08:04.541 [2024-11-16 16:43:50.283512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a6e6e6e cdw11:6e6e306e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.541 [2024-11-16 16:43:50.283538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.801 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.801 #63 NEW cov: 11821 ft: 14988 corp: 43/830b lim: 40 exec/s: 63 rss: 69Mb L: 13/40 MS: 1 ChangeByte- 00:08:04.801 [2024-11-16 16:43:50.323976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:9898000d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.801 [2024-11-16 16:43:50.324003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:04.801 [2024-11-16 16:43:50.324057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:98989898 cdw11:9898980a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.801 [2024-11-16 16:43:50.324071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.801 [2024-11-16 16:43:50.324124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0d989898 cdw11:98b19898 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.801 [2024-11-16 16:43:50.324138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.801 #64 NEW cov: 11821 ft: 14997 corp: 44/855b lim: 40 exec/s: 64 rss: 69Mb L: 25/40 MS: 1 ChangeByte- 00:08:04.801 [2024-11-16 16:43:50.363753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.801 [2024-11-16 16:43:50.363779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.801 #67 NEW cov: 11821 ft: 15000 corp: 45/870b lim: 40 exec/s: 67 rss: 70Mb L: 15/40 MS: 3 CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:04.801 [2024-11-16 16:43:50.403907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:98989800 cdw11:0d9898ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.801 [2024-11-16 16:43:50.403934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.801 #68 NEW cov: 11821 ft: 15006 corp: 46/883b lim: 40 exec/s: 34 rss: 70Mb L: 13/40 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:04.801 #68 DONE cov: 11821 ft: 15006 corp: 46/883b lim: 40 exec/s: 34 rss: 70Mb 00:08:04.801 ###### Recommended dictionary. ###### 00:08:04.801 "\377\377\377\377" # Uses: 0 00:08:04.801 ###### End of recommended dictionary. 
###### 00:08:04.801 Done 68 runs in 2 second(s) 00:08:04.801 16:43:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:08:04.801 16:43:50 -- ../common.sh@72 -- # (( i++ )) 00:08:04.801 16:43:50 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.801 16:43:50 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:04.801 16:43:50 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:04.801 16:43:50 -- nvmf/run.sh@24 -- # local timen=1 00:08:04.801 16:43:50 -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.801 16:43:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:04.801 16:43:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:04.801 16:43:50 -- nvmf/run.sh@29 -- # printf %02d 13 00:08:04.801 16:43:50 -- nvmf/run.sh@29 -- # port=4413 00:08:04.801 16:43:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:05.060 16:43:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:05.060 16:43:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.060 16:43:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:08:05.060 [2024-11-16 16:43:50.579272] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:05.060 [2024-11-16 16:43:50.579335] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid488830 ] 00:08:05.060 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.060 [2024-11-16 16:43:50.760088] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.061 [2024-11-16 16:43:50.780383] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:05.061 [2024-11-16 16:43:50.780503] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.320 [2024-11-16 16:43:50.832180] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.320 [2024-11-16 16:43:50.848521] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:05.320 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.320 INFO: Seed: 179513225 00:08:05.320 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:05.320 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:05.320 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:05.320 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.320 #2 INITED exec/s: 0 rss: 60Mb 00:08:05.320 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.320 This may also happen if the target rejected all inputs we tried so far 00:08:05.321 [2024-11-16 16:43:50.903937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.321 [2024-11-16 16:43:50.903967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.321 [2024-11-16 16:43:50.904028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.321 [2024-11-16 16:43:50.904043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.580 NEW_FUNC[1/669]: 0x4638f8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:05.580 NEW_FUNC[2/669]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.580 #5 NEW cov: 11580 ft: 11581 corp: 2/17b lim: 40 exec/s: 0 rss: 67Mb L: 16/16 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes- 00:08:05.580 [2024-11-16 16:43:51.204664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.580 [2024-11-16 16:43:51.204704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.580 [2024-11-16 16:43:51.204769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.580 [2024-11-16 16:43:51.204785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.580 [2024-11-16 16:43:51.204830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a0a0a0a0 cdw11:a0a00a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.580 [2024-11-16 16:43:51.204846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.580 NEW_FUNC[1/1]: 0x1723ef8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:08:05.580 #9 NEW cov: 11695 ft: 12396 corp: 3/41b lim: 40 exec/s: 0 rss: 67Mb L: 24/24 MS: 4 ShuffleBytes-CopyPart-CrossOver-InsertRepeatedBytes- 00:08:05.580 [2024-11-16 16:43:51.244441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78787878 cdw11:78787878 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.580 [2024-11-16 16:43:51.244468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.580 #11 NEW cov: 11701 ft: 12916 corp: 4/50b lim: 40 exec/s: 0 rss: 67Mb L: 9/24 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:05.580 [2024-11-16 16:43:51.284654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.580 [2024-11-16 16:43:51.284691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.580 [2024-11-16 16:43:51.284749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a10 cdw11:6a6a6a29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.580 [2024-11-16 16:43:51.284763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.580 #12 NEW cov: 11786 ft: 13187 corp: 5/66b lim: 40 exec/s: 0 rss: 67Mb L: 16/24 MS: 1 ChangeBinInt- 00:08:05.580 [2024-11-16 16:43:51.324886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:3e6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.580 [2024-11-16 16:43:51.324913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.580 [2024-11-16 16:43:51.324971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:106a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.580 [2024-11-16 16:43:51.324985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.840 #13 NEW cov: 11786 ft: 13290 corp: 6/83b lim: 40 exec/s: 0 rss: 67Mb L: 17/24 MS: 1 InsertByte- 00:08:05.840 [2024-11-16 16:43:51.365040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.365066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.365124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.365139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.365193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.365207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.840 #14 NEW cov: 11786 ft: 13339 corp: 7/108b lim: 40 exec/s: 0 rss: 67Mb L: 25/25 MS: 1 CrossOver- 00:08:05.840 [2024-11-16 16:43:51.405276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.405302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.405359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:0c0c0c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.405373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.405429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 
cdw10:0c0c0c0c cdw11:0c0c0ca0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.405443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.405495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.405508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.840 #15 NEW cov: 11786 ft: 13849 corp: 8/143b lim: 40 exec/s: 0 rss: 67Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:05.840 [2024-11-16 16:43:51.445053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.445079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.840 #16 NEW cov: 11786 ft: 13921 corp: 9/157b lim: 40 exec/s: 0 rss: 68Mb L: 14/35 MS: 1 EraseBytes- 00:08:05.840 [2024-11-16 16:43:51.485396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a0a0a0a0 cdw11:a0a4a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.485423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.485478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.485491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.485545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a0a0a0a0 cdw11:a0a00a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.485559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.840 #17 NEW cov: 11786 ft: 13938 corp: 10/181b lim: 40 exec/s: 0 rss: 68Mb L: 24/35 MS: 1 ChangeBinInt- 00:08:05.840 [2024-11-16 16:43:51.525395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:2b3e6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.525421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.525478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a106a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.525491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.840 #18 NEW cov: 11786 ft: 14004 corp: 11/199b lim: 40 exec/s: 0 rss: 68Mb L: 18/35 MS: 1 InsertByte- 00:08:05.840 [2024-11-16 16:43:51.565753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 
[2024-11-16 16:43:51.565779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.565833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.565847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.565902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.565919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.840 [2024-11-16 16:43:51.565973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29666666 cdw11:66666666 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.840 [2024-11-16 16:43:51.565986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.100 #19 NEW cov: 11786 ft: 14031 corp: 12/236b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:06.100 [2024-11-16 16:43:51.605654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a976a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.100 [2024-11-16 16:43:51.605686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.100 [2024-11-16 16:43:51.605745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:106a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.100 [2024-11-16 16:43:51.605758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.100 #20 NEW cov: 11786 ft: 14168 corp: 13/253b lim: 40 exec/s: 0 rss: 68Mb L: 17/37 MS: 1 InsertByte- 00:08:06.100 [2024-11-16 16:43:51.645879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a0a0a0a0 cdw11:a0a4a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.100 [2024-11-16 16:43:51.645904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.100 [2024-11-16 16:43:51.645964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.100 [2024-11-16 16:43:51.645977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.100 [2024-11-16 16:43:51.646032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0250a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.100 [2024-11-16 16:43:51.646045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.101 #21 NEW cov: 11786 ft: 14203 corp: 14/278b lim: 40 exec/s: 0 rss: 68Mb L: 25/37 MS: 1 InsertByte- 00:08:06.101 [2024-11-16 16:43:51.685855] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6b cdw11:2b3e6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.685882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.685940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a106a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.685954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.101 #22 NEW cov: 11786 ft: 14243 corp: 15/296b lim: 40 exec/s: 0 rss: 68Mb L: 18/37 MS: 1 ChangeBit- 00:08:06.101 [2024-11-16 16:43:51.726238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.726264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.726320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:0c0c0c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.726334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.726391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0c0c0c0e cdw11:0c0c0ca0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.726405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.726458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.726471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.101 #23 NEW cov: 11786 ft: 14267 corp: 16/331b lim: 40 exec/s: 0 rss: 68Mb L: 35/37 MS: 1 ChangeBinInt- 00:08:06.101 [2024-11-16 16:43:51.766116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6b cdw11:2b3e6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.766142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.766197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a106a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.766211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.101 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.101 #24 NEW cov: 11809 ft: 14288 corp: 17/349b lim: 40 exec/s: 0 rss: 68Mb L: 18/37 MS: 1 ChangeBinInt- 00:08:06.101 [2024-11-16 16:43:51.806367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6ae0 cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.806393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.806451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.806465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.806521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.806535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.101 #25 NEW cov: 11809 ft: 14314 corp: 18/375b lim: 40 exec/s: 0 rss: 68Mb L: 26/37 MS: 1 InsertByte- 00:08:06.101 [2024-11-16 16:43:51.846639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a080a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.846666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.846729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:0c0c0c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.846745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.846802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0ca0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.846816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.101 [2024-11-16 16:43:51.846873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.101 [2024-11-16 16:43:51.846889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.360 #26 NEW cov: 11809 ft: 14321 corp: 19/410b lim: 40 exec/s: 0 rss: 68Mb L: 35/37 MS: 1 ChangeBit- 00:08:06.360 [2024-11-16 16:43:51.886331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:51.886358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.360 #27 NEW cov: 11809 ft: 14349 corp: 20/424b lim: 40 exec/s: 27 rss: 68Mb L: 14/37 MS: 1 CrossOver- 00:08:06.360 [2024-11-16 16:43:51.926480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78787878 cdw11:78786878 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:51.926506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.360 #28 NEW cov: 11809 ft: 14428 corp: 21/433b lim: 40 exec/s: 28 rss: 68Mb L: 9/37 MS: 1 ChangeBit- 00:08:06.360 [2024-11-16 16:43:51.966983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a0a0a0a0 cdw11:a032a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:51.967009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.360 [2024-11-16 16:43:51.967066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:a00c0c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:51.967081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.360 [2024-11-16 16:43:51.967135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0e0c0c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:51.967149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.360 [2024-11-16 16:43:51.967205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:51.967218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.360 #29 NEW cov: 11809 ft: 14501 corp: 22/469b lim: 40 exec/s: 29 rss: 69Mb L: 36/37 MS: 1 InsertByte- 00:08:06.360 [2024-11-16 16:43:52.006997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:1a1a1a1a cdw11:1a1a1a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:52.007023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.360 [2024-11-16 16:43:52.007079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:1a1a1a1a cdw11:1a1a1a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:52.007093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.360 [2024-11-16 16:43:52.007146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:1a1a1a1a cdw11:1a1a1a1a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:52.007159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.360 #31 NEW cov: 11809 ft: 14515 corp: 23/497b lim: 40 exec/s: 31 rss: 69Mb L: 28/37 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:06.360 [2024-11-16 16:43:52.046814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a7a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:52.046844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.360 #32 NEW cov: 11809 ft: 14563 corp: 24/511b lim: 40 exec/s: 32 rss: 69Mb L: 14/37 MS: 1 ChangeBit- 00:08:06.360 
[2024-11-16 16:43:52.087062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:3e6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:52.087089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.360 [2024-11-16 16:43:52.087142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:10ededed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.360 [2024-11-16 16:43:52.087156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.360 #33 NEW cov: 11809 ft: 14576 corp: 25/531b lim: 40 exec/s: 33 rss: 69Mb L: 20/37 MS: 1 InsertRepeatedBytes- 00:08:06.619 [2024-11-16 16:43:52.127164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:3e6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.127192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.619 [2024-11-16 16:43:52.127244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:10dbeded SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.127258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.619 #34 NEW cov: 11809 ft: 14593 corp: 26/551b lim: 40 exec/s: 34 rss: 69Mb L: 20/37 MS: 1 ChangeByte- 00:08:06.619 [2024-11-16 16:43:52.167318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:2b3e6af9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.167344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.619 [2024-11-16 16:43:52.167399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a6a106a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.167412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.619 #35 NEW cov: 11809 ft: 14594 corp: 27/570b lim: 40 exec/s: 35 rss: 69Mb L: 19/37 MS: 1 InsertByte- 00:08:06.619 [2024-11-16 16:43:52.207526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:2b3e6a2b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.207552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.619 [2024-11-16 16:43:52.207606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3e6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.207619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.619 [2024-11-16 16:43:52.207676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:106a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.207690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.619 #36 NEW cov: 11809 ft: 14636 corp: 28/600b lim: 40 exec/s: 36 rss: 69Mb L: 30/37 MS: 1 CopyPart- 00:08:06.619 [2024-11-16 16:43:52.247525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6b cdw11:2b3e746a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.247555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.619 [2024-11-16 16:43:52.247607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a6a106a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.247621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.619 #37 NEW cov: 11809 ft: 14674 corp: 29/619b lim: 40 exec/s: 37 rss: 69Mb L: 19/37 MS: 1 InsertByte- 00:08:06.619 [2024-11-16 16:43:52.287897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a0a0a0a0 cdw11:a032a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.287923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.619 [2024-11-16 16:43:52.287979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:a00c0c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.287993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.619 [2024-11-16 16:43:52.288044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0c0c150c cdw11:0e0c0c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.619 [2024-11-16 16:43:52.288058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.619 [2024-11-16 16:43:52.288112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.620 [2024-11-16 16:43:52.288125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.620 #38 NEW cov: 11809 ft: 14687 corp: 30/655b lim: 40 exec/s: 38 rss: 69Mb L: 36/37 MS: 1 ChangeByte- 00:08:06.620 [2024-11-16 16:43:52.327787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a06006b cdw11:2b3e6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.620 [2024-11-16 16:43:52.327815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.620 [2024-11-16 16:43:52.327866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a106a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.620 [2024-11-16 16:43:52.327879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.620 #39 NEW cov: 
11809 ft: 14693 corp: 31/673b lim: 40 exec/s: 39 rss: 69Mb L: 18/37 MS: 1 CMP- DE: "\006\000"- 00:08:06.879 [2024-11-16 16:43:52.368097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:6a10a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.879 [2024-11-16 16:43:52.368125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.879 [2024-11-16 16:43:52.368182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.879 [2024-11-16 16:43:52.368196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.879 [2024-11-16 16:43:52.368253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a0a0a0a0 cdw11:a0a00a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.879 [2024-11-16 16:43:52.368267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.879 #40 NEW cov: 11809 ft: 14706 corp: 32/697b lim: 40 exec/s: 40 rss: 69Mb L: 24/37 MS: 1 CrossOver- 00:08:06.879 [2024-11-16 16:43:52.408010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6ae0 cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.879 [2024-11-16 16:43:52.408036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.879 [2024-11-16 16:43:52.408090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.879 [2024-11-16 16:43:52.408103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.879 #41 NEW cov: 11809 ft: 14733 corp: 33/720b lim: 40 exec/s: 41 rss: 69Mb L: 23/37 MS: 1 EraseBytes- 00:08:06.879 [2024-11-16 16:43:52.448364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:a0a0a0b8 cdw11:b8b8a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.879 [2024-11-16 16:43:52.448391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.879 [2024-11-16 16:43:52.448446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a0a0a0a0 cdw11:a0a0a00c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.879 [2024-11-16 16:43:52.448460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.879 [2024-11-16 16:43:52.448511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0c0c0c0c cdw11:0c0c0c0c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.879 [2024-11-16 16:43:52.448525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.879 [2024-11-16 16:43:52.448576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:0c0ca0a0 cdw11:a0a0a0a0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:06.879 [2024-11-16 16:43:52.448589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.879 #42 NEW cov: 11809 ft: 14750 corp: 34/758b lim: 40 exec/s: 42 rss: 69Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:06.879 [2024-11-16 16:43:52.488095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a0e00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.879 [2024-11-16 16:43:52.488122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.879 #43 NEW cov: 11809 ft: 14761 corp: 35/772b lim: 40 exec/s: 43 rss: 69Mb L: 14/38 MS: 1 ChangeBinInt- 00:08:06.879 [2024-11-16 16:43:52.528385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.880 [2024-11-16 16:43:52.528411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.880 [2024-11-16 16:43:52.528466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a106a6a cdw11:6a290600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.880 [2024-11-16 16:43:52.528479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.880 #44 NEW cov: 11809 ft: 14766 corp: 36/788b lim: 40 exec/s: 44 rss: 69Mb L: 16/38 MS: 1 PersAutoDict- DE: "\006\000"- 00:08:06.880 [2024-11-16 16:43:52.568340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78787878 cdw11:78786878 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.880 [2024-11-16 16:43:52.568367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.880 #45 NEW cov: 11809 ft: 14781 corp: 37/797b lim: 40 exec/s: 45 rss: 69Mb L: 9/38 MS: 1 ShuffleBytes- 00:08:06.880 [2024-11-16 16:43:52.608622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:3e6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.880 [2024-11-16 16:43:52.608651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.880 [2024-11-16 16:43:52.608710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:10ededed SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.880 [2024-11-16 16:43:52.608725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.139 #46 NEW cov: 11809 ft: 14788 corp: 38/819b lim: 40 exec/s: 46 rss: 69Mb L: 22/38 MS: 1 CopyPart- 00:08:07.139 [2024-11-16 16:43:52.648687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a976a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.139 [2024-11-16 16:43:52.648712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.139 [2024-11-16 16:43:52.648769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:9b959595 
cdw11:ef959595 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.139 [2024-11-16 16:43:52.648783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.139 #47 NEW cov: 11809 ft: 14806 corp: 39/836b lim: 40 exec/s: 47 rss: 69Mb L: 17/38 MS: 1 ChangeBinInt- 00:08:07.139 [2024-11-16 16:43:52.688837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a626a cdw11:3e6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.139 [2024-11-16 16:43:52.688863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.139 [2024-11-16 16:43:52.688919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:10dbeded SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.139 [2024-11-16 16:43:52.688933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.139 #48 NEW cov: 11809 ft: 14810 corp: 40/856b lim: 40 exec/s: 48 rss: 69Mb L: 20/38 MS: 1 ChangeBit- 00:08:07.139 [2024-11-16 16:43:52.728827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a0e00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.139 [2024-11-16 16:43:52.728852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.139 #49 NEW cov: 11809 ft: 14828 corp: 41/868b lim: 40 exec/s: 49 rss: 69Mb L: 12/38 MS: 1 EraseBytes- 00:08:07.139 [2024-11-16 16:43:52.768936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:6a6a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.139 [2024-11-16 16:43:52.768962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.139 #50 NEW cov: 11809 ft: 14846 corp: 42/879b lim: 40 exec/s: 50 rss: 70Mb L: 11/38 MS: 1 EraseBytes- 00:08:07.139 [2024-11-16 16:43:52.809186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:5b6a6a97 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.139 [2024-11-16 16:43:52.809212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.139 [2024-11-16 16:43:52.809266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a9b9595 cdw11:95ef9595 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.140 [2024-11-16 16:43:52.809279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.140 #51 NEW cov: 11809 ft: 14908 corp: 43/897b lim: 40 exec/s: 51 rss: 70Mb L: 18/38 MS: 1 InsertByte- 00:08:07.140 [2024-11-16 16:43:52.849162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:78787825 cdw11:78787868 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.140 [2024-11-16 16:43:52.849187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.140 #52 NEW cov: 11809 ft: 14926 corp: 44/907b lim: 40 exec/s: 52 rss: 70Mb L: 10/38 MS: 1 InsertByte- 00:08:07.399 [2024-11-16 
16:43:52.889447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a6a6a6a cdw11:3e4a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.399 [2024-11-16 16:43:52.889473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.399 [2024-11-16 16:43:52.889531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:6a6a6a6a cdw11:106a6a6a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.399 [2024-11-16 16:43:52.889545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.399 #53 NEW cov: 11809 ft: 14930 corp: 45/924b lim: 40 exec/s: 26 rss: 70Mb L: 17/38 MS: 1 ChangeBit- 00:08:07.399 #53 DONE cov: 11809 ft: 14930 corp: 45/924b lim: 40 exec/s: 26 rss: 70Mb 00:08:07.399 ###### Recommended dictionary. ###### 00:08:07.399 "\006\000" # Uses: 1 00:08:07.399 ###### End of recommended dictionary. ###### 00:08:07.399 Done 53 runs in 2 second(s) 00:08:07.399 16:43:53 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:07.399 16:43:53 -- ../common.sh@72 -- # (( i++ )) 00:08:07.399 16:43:53 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.399 16:43:53 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:07.399 16:43:53 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:07.399 16:43:53 -- nvmf/run.sh@24 -- # local timen=1 00:08:07.399 16:43:53 -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.399 16:43:53 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:07.399 16:43:53 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:07.399 16:43:53 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:07.399 16:43:53 -- nvmf/run.sh@29 -- # port=4414 00:08:07.399 16:43:53 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:07.399 16:43:53 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:07.399 16:43:53 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.399 16:43:53 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:07.399 [2024-11-16 16:43:53.061076] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
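For readability, a condensed sketch of the per-fuzzer setup the xtrace above performs for fuzzer 14. This is a hand-written reconstruction from the trace, not the script itself: the `44$(printf %02d ...)` port derivation and the redirect of the sed output into the per-fuzzer config are inferred from the `printf %02d 14` / `port=4414` / `-c /tmp/fuzz_json_14.conf` lines, and the long Jenkins workspace paths are abbreviated to `$rootdir` and `$output_dir`.

```bash
#!/usr/bin/env bash
# Sketch of nvmf/run.sh start_llvm_fuzz for fuzzer 14 (reconstructed, abbreviated).
fuzzer_type=14
timen=1                                  # -t: run time in seconds
core=0x1                                 # -m: reactor core mask
corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf

port=44$(printf %02d "$fuzzer_type")     # 14 -> 4414 (derivation inferred)
mkdir -p "$corpus_dir"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

# Point the JSON config at this fuzzer's listener port; the redirect into
# $nvmf_cfg is assumed (xtrace does not show redirections).
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

"$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
    -m "$core" -s 512 -P "$output_dir/llvm/" -F "$trid" \
    -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" \
    -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"
```

Each fuzzer number thus gets its own TCP port (4414 for fuzzer 14, 4415 for fuzzer 15 below), corpus directory, JSON config, and RPC socket, which is why the runs can be told apart in the interleaved output.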
00:08:07.399 [2024-11-16 16:43:53.061151] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid489347 ] 00:08:07.399 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.659 [2024-11-16 16:43:53.229788] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.659 [2024-11-16 16:43:53.249448] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:07.659 [2024-11-16 16:43:53.249560] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.659 [2024-11-16 16:43:53.300811] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.659 [2024-11-16 16:43:53.317145] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:07.659 INFO: Running with entropic power schedule (0xFF, 100). 00:08:07.659 INFO: Seed: 2648505402 00:08:07.659 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:07.659 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:07.659 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:07.659 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.659 #2 INITED exec/s: 0 rss: 59Mb 00:08:07.659 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:07.659 This may also happen if the target rejected all inputs we tried so far 00:08:07.659 [2024-11-16 16:43:53.361917] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.659 [2024-11-16 16:43:53.361954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.659 [2024-11-16 16:43:53.361989] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.659 [2024-11-16 16:43:53.362005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.659 [2024-11-16 16:43:53.362035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.659 [2024-11-16 16:43:53.362052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.918 NEW_FUNC[1/671]: 0x4654c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:07.918 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:07.918 #3 NEW cov: 11570 ft: 11577 corp: 2/27b lim: 35 exec/s: 0 rss: 67Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:08:08.177 [2024-11-16 16:43:53.682635] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.682680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.682716] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.682733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.682762] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.682778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.178 #6 NEW cov: 11689 ft: 12063 corp: 3/52b lim: 35 exec/s: 0 rss: 68Mb L: 25/26 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:08.178 [2024-11-16 16:43:53.732646] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.732687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.732719] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.732735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.732763] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.732778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.178 #7 NEW cov: 11695 ft: 12280 corp: 4/77b lim: 35 exec/s: 0 rss: 68Mb L: 25/26 MS: 1 ChangeByte- 00:08:08.178 [2024-11-16 16:43:53.792814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.792850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.792882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.792898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.792926] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.792941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.178 #8 NEW cov: 11780 ft: 12638 corp: 5/103b lim: 35 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 ShuffleBytes- 00:08:08.178 [2024-11-16 16:43:53.853000] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.853032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.853063] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.853078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.853105] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.853120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.178 #9 NEW cov: 11787 ft: 12759 corp: 6/130b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 InsertByte- 00:08:08.178 [2024-11-16 16:43:53.913123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.913155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.913187] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.913204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.178 [2024-11-16 16:43:53.913231] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.178 [2024-11-16 16:43:53.913247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.437 #10 NEW cov: 11787 ft: 12882 corp: 7/157b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 CopyPart- 00:08:08.437 [2024-11-16 16:43:53.963285] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.437 [2024-11-16 16:43:53.963317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.437 [2024-11-16 16:43:53.963348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.437 [2024-11-16 16:43:53.963364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.437 [2024-11-16 16:43:53.963391] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.437 [2024-11-16 16:43:53.963407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.437 #14 NEW cov: 11787 ft: 12944 corp: 8/178b lim: 35 exec/s: 0 rss: 68Mb L: 21/27 MS: 4 InsertByte-CMP-InsertByte-CrossOver- DE: "\377\377"- 00:08:08.437 [2024-11-16 16:43:54.013383] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.437 [2024-11-16 16:43:54.013415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.437 [2024-11-16 16:43:54.013447] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.437 [2024-11-16 16:43:54.013463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.437 [2024-11-16 16:43:54.013491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.437 [2024-11-16 16:43:54.013507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.437 #15 NEW cov: 11787 ft: 13043 corp: 9/203b lim: 35 exec/s: 0 rss: 68Mb L: 25/27 MS: 1 ChangeBit- 00:08:08.437 [2024-11-16 16:43:54.063395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.438 [2024-11-16 16:43:54.063427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.438 #16 NEW cov: 11787 ft: 13964 corp: 10/214b lim: 35 exec/s: 0 rss: 68Mb L: 11/27 MS: 1 CrossOver- 00:08:08.438 [2024-11-16 16:43:54.133774] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.438 [2024-11-16 16:43:54.133806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.438 [2024-11-16 16:43:54.133837] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.438 [2024-11-16 16:43:54.133853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.438 [2024-11-16 16:43:54.133881] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ee SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.438 [2024-11-16 16:43:54.133896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.438 [2024-11-16 16:43:54.133925] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.438 [2024-11-16 16:43:54.133940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.438 #17 NEW cov: 11787 ft: 14266 corp: 11/248b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:08.697 [2024-11-16 16:43:54.193872] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.193904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.193935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.193951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.697 #18 NEW cov: 11787 ft: 14447 corp: 12/266b lim: 
35 exec/s: 0 rss: 68Mb L: 18/34 MS: 1 EraseBytes- 00:08:08.697 [2024-11-16 16:43:54.254084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.254119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.254151] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.254166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.254194] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.254210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.254238] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.254269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.697 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:08.697 #19 NEW cov: 11804 ft: 14482 corp: 13/297b lim: 35 exec/s: 0 rss: 68Mb L: 31/34 MS: 1 CopyPart- 00:08:08.697 [2024-11-16 16:43:54.324214] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.324246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.324277] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.324292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.324320] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.324336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.697 #20 NEW cov: 11804 ft: 14539 corp: 14/324b lim: 35 exec/s: 20 rss: 68Mb L: 27/34 MS: 1 ShuffleBytes- 00:08:08.697 [2024-11-16 16:43:54.374384] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.374415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.374447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.374462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.374490] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:6 cdw10:80000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.374506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.697 #21 NEW cov: 11804 ft: 14570 corp: 15/347b lim: 35 exec/s: 21 rss: 69Mb L: 23/34 MS: 1 InsertRepeatedBytes- 00:08:08.697 [2024-11-16 16:43:54.434499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.434530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.434562] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.434577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.697 [2024-11-16 16:43:54.434609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.697 [2024-11-16 16:43:54.434625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.957 #22 NEW cov: 11804 ft: 14597 corp: 16/374b lim: 35 exec/s: 22 rss: 69Mb L: 27/34 MS: 1 ChangeBit- 00:08:08.957 [2024-11-16 16:43:54.494651] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.494690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.494723] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.494738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.494766] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:6 cdw10:80000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.494782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.957 #23 NEW cov: 11804 ft: 14622 corp: 17/397b lim: 35 exec/s: 23 rss: 69Mb L: 23/34 MS: 1 CopyPart- 00:08:08.957 [2024-11-16 16:43:54.554822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.554854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.554886] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.554902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.554929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.554945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.957 #24 NEW cov: 11804 ft: 14663 corp: 18/424b lim: 35 exec/s: 24 rss: 69Mb L: 27/34 MS: 1 ShuffleBytes- 00:08:08.957 [2024-11-16 16:43:54.615035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.615065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.615098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.615113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.615141] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.615157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.615185] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.615200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.957 #25 NEW cov: 11804 ft: 14702 corp: 19/452b lim: 35 exec/s: 25 rss: 69Mb L: 28/34 MS: 1 InsertByte- 00:08:08.957 [2024-11-16 16:43:54.665218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.665248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.665280] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.665295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.665324] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.665339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.957 [2024-11-16 16:43:54.665368] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.957 [2024-11-16 16:43:54.665383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.217 #26 NEW cov: 11804 ft: 14746 corp: 20/480b lim: 35 exec/s: 26 rss: 69Mb L: 28/34 MS: 1 
InsertByte- 00:08:09.217 [2024-11-16 16:43:54.725265] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.725295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.217 [2024-11-16 16:43:54.725327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.725343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.217 [2024-11-16 16:43:54.725372] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES SOFTWARE PROGRESS MARKER cid:6 cdw10:80000080 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.725388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.217 #27 NEW cov: 11804 ft: 14761 corp: 21/505b lim: 35 exec/s: 27 rss: 69Mb L: 25/34 MS: 1 PersAutoDict- DE: "\377\377"- 00:08:09.217 [2024-11-16 16:43:54.775427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.775457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.217 [2024-11-16 16:43:54.775489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.775505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.217 [2024-11-16 16:43:54.775533] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.775549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.217 #28 NEW cov: 11804 ft: 14785 corp: 22/531b lim: 35 exec/s: 28 rss: 69Mb L: 26/34 MS: 1 ChangeBinInt- 00:08:09.217 [2024-11-16 16:43:54.825454] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.825484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.217 [2024-11-16 16:43:54.825516] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.825536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.217 #29 NEW cov: 11804 ft: 14793 corp: 23/550b lim: 35 exec/s: 29 rss: 69Mb L: 19/34 MS: 1 InsertByte- 00:08:09.217 [2024-11-16 16:43:54.875652] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.875687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:09.217 [2024-11-16 16:43:54.875720] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.875735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.217 [2024-11-16 16:43:54.875763] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.875779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.217 #30 NEW cov: 11804 ft: 14830 corp: 24/576b lim: 35 exec/s: 30 rss: 69Mb L: 26/34 MS: 1 ChangeByte- 00:08:09.217 [2024-11-16 16:43:54.935849] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.935880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.217 [2024-11-16 16:43:54.935912] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.935928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.217 [2024-11-16 16:43:54.935956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.217 [2024-11-16 16:43:54.935971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.477 #31 NEW cov: 11804 ft: 14855 corp: 25/601b lim: 35 exec/s: 31 rss: 69Mb L: 25/34 MS: 1 CrossOver- 00:08:09.477 [2024-11-16 16:43:54.985943] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:54.985974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.477 [2024-11-16 16:43:54.986006] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:54.986021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.477 [2024-11-16 16:43:54.986049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:54.986064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.477 #32 NEW cov: 11804 ft: 14873 corp: 26/628b lim: 35 exec/s: 32 rss: 69Mb L: 27/34 MS: 1 ChangeByte- 00:08:09.477 [2024-11-16 16:43:55.036076] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.036106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:09.477 [2024-11-16 16:43:55.036138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.036157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.477 [2024-11-16 16:43:55.036185] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.036201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.477 #33 NEW cov: 11804 ft: 14893 corp: 27/655b lim: 35 exec/s: 33 rss: 69Mb L: 27/34 MS: 1 ChangeBinInt- 00:08:09.477 [2024-11-16 16:43:55.096210] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.096240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.477 [2024-11-16 16:43:55.096272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.096287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.477 #34 NEW cov: 11804 ft: 14920 corp: 28/671b lim: 35 exec/s: 34 rss: 69Mb L: 16/34 MS: 1 EraseBytes- 00:08:09.477 [2024-11-16 16:43:55.146463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.146495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.477 [2024-11-16 16:43:55.146529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.146545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.477 [2024-11-16 16:43:55.146574] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ee SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.146591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.477 [2024-11-16 16:43:55.146620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.146636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.477 #35 NEW cov: 11804 ft: 14997 corp: 29/705b lim: 35 exec/s: 35 rss: 69Mb L: 34/34 MS: 1 ChangeASCIIInt- 00:08:09.477 [2024-11-16 16:43:55.206563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.206595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.477 [2024-11-16 
16:43:55.206627] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.206643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.477 [2024-11-16 16:43:55.206678] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000003b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.477 [2024-11-16 16:43:55.206694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.737 #36 NEW cov: 11804 ft: 15024 corp: 30/731b lim: 35 exec/s: 36 rss: 69Mb L: 26/34 MS: 1 ChangeByte- 00:08:09.737 [2024-11-16 16:43:55.256655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.737 [2024-11-16 16:43:55.256694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.737 [2024-11-16 16:43:55.256730] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.737 [2024-11-16 16:43:55.256746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.737 [2024-11-16 16:43:55.256775] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.737 [2024-11-16 16:43:55.256790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.737 #37 NEW cov: 11811 ft: 15033 corp: 31/757b lim: 35 exec/s: 37 rss: 69Mb L: 26/34 MS: 1 ChangeBinInt- 00:08:09.737 [2024-11-16 16:43:55.306845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.737 [2024-11-16 16:43:55.306877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.737 [2024-11-16 16:43:55.306908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.737 [2024-11-16 16:43:55.306925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.737 [2024-11-16 16:43:55.306953] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000b8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.737 [2024-11-16 16:43:55.306968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.737 #38 NEW cov: 11811 ft: 15056 corp: 32/782b lim: 35 exec/s: 38 rss: 69Mb L: 25/34 MS: 1 ShuffleBytes- 00:08:09.737 [2024-11-16 16:43:55.356866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.737 [2024-11-16 16:43:55.356897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.737 [2024-11-16 16:43:55.356929] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:8000008e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.737 [2024-11-16 16:43:55.356946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.737 #39 NEW cov: 11811 ft: 15072 corp: 33/801b lim: 35 exec/s: 19 rss: 69Mb L: 19/34 MS: 1 InsertByte- 00:08:09.737 #39 DONE cov: 11811 ft: 15072 corp: 33/801b lim: 35 exec/s: 19 rss: 69Mb 00:08:09.737 ###### Recommended dictionary. ###### 00:08:09.737 "\377\377" # Uses: 1 00:08:09.737 ###### End of recommended dictionary. ###### 00:08:09.737 Done 39 runs in 2 second(s) 00:08:09.997 16:43:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:09.997 16:43:55 -- ../common.sh@72 -- # (( i++ )) 00:08:09.997 16:43:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.997 16:43:55 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:09.997 16:43:55 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:09.997 16:43:55 -- nvmf/run.sh@24 -- # local timen=1 00:08:09.997 16:43:55 -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.997 16:43:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:09.997 16:43:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:09.997 16:43:55 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:09.997 16:43:55 -- nvmf/run.sh@29 -- # port=4415 00:08:09.997 16:43:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:09.997 16:43:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:09.997 16:43:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.997 16:43:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:09.997 [2024-11-16 16:43:55.548656] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:09.997 [2024-11-16 16:43:55.548756] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid489813 ] 00:08:09.997 EAL: No free 2048 kB hugepages reported on node 1 00:08:09.997 [2024-11-16 16:43:55.730471] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.256 [2024-11-16 16:43:55.750131] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.256 [2024-11-16 16:43:55.750248] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.256 [2024-11-16 16:43:55.801479] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.256 [2024-11-16 16:43:55.817789] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:10.256 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:10.256 INFO: Seed: 853532930 00:08:10.256 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:10.256 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:10.256 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:10.256 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.256 #2 INITED exec/s: 0 rss: 59Mb 00:08:10.256 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:10.256 This may also happen if the target rejected all inputs we tried so far 00:08:10.256 [2024-11-16 16:43:55.863254] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.256 [2024-11-16 16:43:55.863283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.256 [2024-11-16 16:43:55.863340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.256 [2024-11-16 16:43:55.863354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.257 [2024-11-16 16:43:55.863407] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.257 [2024-11-16 16:43:55.863422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.257 [2024-11-16 16:43:55.863475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.257 [2024-11-16 16:43:55.863488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.515 NEW_FUNC[1/670]: 0x466a08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:10.515 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.515 #5 NEW cov: 11564 ft: 11565 corp: 2/30b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 3 ChangeByte-ChangeBinInt-InsertRepeatedBytes- 00:08:10.515 [2024-11-16 16:43:56.174201] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.515 [2024-11-16 16:43:56.174232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.515 [2024-11-16 16:43:56.174295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.515 [2024-11-16 16:43:56.174313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.515 [2024-11-16 16:43:56.174376] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.515 [2024-11-16 16:43:56.174391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.515 
[2024-11-16 16:43:56.174452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.515 [2024-11-16 16:43:56.174465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.515 #6 NEW cov: 11677 ft: 12017 corp: 3/59b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeByte- 00:08:10.515 [2024-11-16 16:43:56.224264] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.515 [2024-11-16 16:43:56.224290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.515 [2024-11-16 16:43:56.224349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.516 [2024-11-16 16:43:56.224363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.516 [2024-11-16 16:43:56.224421] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.516 [2024-11-16 16:43:56.224436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.516 [2024-11-16 16:43:56.224494] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.516 [2024-11-16 16:43:56.224509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.516 #7 NEW cov: 11683 ft: 12312 corp: 4/87b lim: 35 exec/s: 0 rss: 67Mb L: 28/29 MS: 1 InsertRepeatedBytes- 00:08:10.775 [2024-11-16 16:43:56.264394] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.264422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.264485] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.264500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.264563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.264578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.264638] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.264653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.775 #8 NEW cov: 11768 ft: 12554 corp: 5/115b lim: 35 exec/s: 0 rss: 67Mb L: 28/29 MS: 1 ChangeBinInt- 00:08:10.775 [2024-11-16 16:43:56.304436] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:5 cdw10:000005a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.304462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.304523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.304541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.775 NEW_FUNC[1/1]: 0x4868f8 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:10.775 #10 NEW cov: 11782 ft: 13118 corp: 6/141b lim: 35 exec/s: 0 rss: 67Mb L: 26/29 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:10.775 [2024-11-16 16:43:56.344629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.344655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.344721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.344736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.344795] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.344808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.344866] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.344880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.775 #11 NEW cov: 11782 ft: 13192 corp: 7/170b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 InsertByte- 00:08:10.775 [2024-11-16 16:43:56.384758] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.384785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.384845] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.384860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.384919] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.384933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.775 [2024-11-16 16:43:56.384992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 
cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.775 [2024-11-16 16:43:56.385005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.776 #12 NEW cov: 11782 ft: 13353 corp: 8/199b lim: 35 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 ChangeByte- 00:08:10.776 [2024-11-16 16:43:56.424864] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.424890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.776 [2024-11-16 16:43:56.424952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.424967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.776 [2024-11-16 16:43:56.425029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.425043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.776 [2024-11-16 16:43:56.425109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.425123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.776 #13 NEW cov: 11782 ft: 13440 corp: 9/229b lim: 35 exec/s: 0 rss: 67Mb L: 30/30 MS: 1 InsertByte- 00:08:10.776 [2024-11-16 16:43:56.464997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.465023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.776 [2024-11-16 16:43:56.465082] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.465096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.776 [2024-11-16 16:43:56.465154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.465169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.776 [2024-11-16 16:43:56.465227] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.465240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.776 #14 NEW cov: 11782 ft: 13519 corp: 10/262b lim: 35 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 CrossOver- 00:08:10.776 [2024-11-16 16:43:56.505133] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.505159] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.776 [2024-11-16 16:43:56.505222] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.505237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.776 [2024-11-16 16:43:56.505297] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.505311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.776 [2024-11-16 16:43:56.505370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:10.776 [2024-11-16 16:43:56.505384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.036 #15 NEW cov: 11782 ft: 13540 corp: 11/296b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 CopyPart- 00:08:11.036 [2024-11-16 16:43:56.545262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.545289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.545352] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.545365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.545424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.545444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.545504] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.545518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.036 #16 NEW cov: 11782 ft: 13574 corp: 12/327b lim: 35 exec/s: 0 rss: 67Mb L: 31/34 MS: 1 CrossOver- 00:08:11.036 [2024-11-16 16:43:56.585364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.585390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.585451] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.585465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.585524] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.585539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.585601] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.585614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.036 #17 NEW cov: 11791 ft: 13677 corp: 13/361b lim: 35 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:11.036 [2024-11-16 16:43:56.625454] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.625481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.625542] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.625556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.625619] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.625633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.625699] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.625712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.036 #18 NEW cov: 11791 ft: 13691 corp: 14/391b lim: 35 exec/s: 0 rss: 68Mb L: 30/34 MS: 1 CopyPart- 00:08:11.036 [2024-11-16 16:43:56.665616] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.665642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.665706] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.665721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.665778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.665795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.665856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.665870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.036 #19 NEW cov: 11791 ft: 13709 corp: 15/424b lim: 35 exec/s: 0 rss: 68Mb L: 33/34 MS: 1 ChangeBit- 00:08:11.036 [2024-11-16 16:43:56.705644] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.705675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.705739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.705753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.036 #21 NEW cov: 11791 ft: 13715 corp: 16/445b lim: 35 exec/s: 0 rss: 68Mb L: 21/34 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:11.036 [2024-11-16 16:43:56.745747] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.745775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.036 [2024-11-16 16:43:56.745839] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.036 [2024-11-16 16:43:56.745854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.036 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.036 #22 NEW cov: 11814 ft: 13771 corp: 17/471b lim: 35 exec/s: 0 rss: 68Mb L: 26/34 MS: 1 CopyPart- 00:08:11.296 [2024-11-16 16:43:56.795990] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.796016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.796079] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.796093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.796152] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.796166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.796224] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.796238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.296 #23 NEW cov: 11814 ft: 13813 corp: 18/501b lim: 35 exec/s: 0 rss: 68Mb L: 30/34 MS: 1 InsertByte- 00:08:11.296 [2024-11-16 16:43:56.836072] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 
cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.836098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.836161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.836178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.836237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.836252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.836311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.836325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.296 #24 NEW cov: 11814 ft: 13830 corp: 19/530b lim: 35 exec/s: 24 rss: 68Mb L: 29/34 MS: 1 ChangeBit- 00:08:11.296 [2024-11-16 16:43:56.876237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.876263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.876323] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.876338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.876396] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000002f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.876410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.876472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.876487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.296 #25 NEW cov: 11814 ft: 13845 corp: 20/563b lim: 35 exec/s: 25 rss: 68Mb L: 33/34 MS: 1 InsertRepeatedBytes- 00:08:11.296 [2024-11-16 16:43:56.916295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.916321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.916385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.916399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.916460] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.916475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.916537] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.916551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.296 #26 NEW cov: 11814 ft: 13922 corp: 21/596b lim: 35 exec/s: 26 rss: 68Mb L: 33/34 MS: 1 ShuffleBytes- 00:08:11.296 [2024-11-16 16:43:56.956237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.956263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.956323] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.956340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.296 #31 NEW cov: 11814 ft: 14117 corp: 22/611b lim: 35 exec/s: 31 rss: 68Mb L: 15/34 MS: 5 InsertByte-ChangeBinInt-EraseBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:11.296 [2024-11-16 16:43:56.996554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.996580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.996643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.996657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.296 [2024-11-16 16:43:56.996727] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000365 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.296 [2024-11-16 16:43:56.996741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.297 [2024-11-16 16:43:56.996802] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.297 [2024-11-16 16:43:56.996815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.297 #32 NEW cov: 11814 ft: 14141 corp: 23/639b lim: 35 exec/s: 32 rss: 68Mb L: 28/34 MS: 1 InsertRepeatedBytes- 00:08:11.297 [2024-11-16 16:43:57.036602] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.297 [2024-11-16 16:43:57.036629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:08:11.297 [2024-11-16 16:43:57.036697] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.297 [2024-11-16 16:43:57.036712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.297 [2024-11-16 16:43:57.036773] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.297 [2024-11-16 16:43:57.036787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.556 #33 NEW cov: 11814 ft: 14253 corp: 24/660b lim: 35 exec/s: 33 rss: 68Mb L: 21/34 MS: 1 EraseBytes- 00:08:11.556 [2024-11-16 16:43:57.076874] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.076899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.076962] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.076976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.077035] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.077049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.077109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.077123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.556 #34 NEW cov: 11814 ft: 14264 corp: 25/690b lim: 35 exec/s: 34 rss: 68Mb L: 30/34 MS: 1 ChangeBinInt- 00:08:11.556 #38 NEW cov: 11814 ft: 14474 corp: 26/701b lim: 35 exec/s: 38 rss: 68Mb L: 11/34 MS: 4 InsertByte-ShuffleBytes-CrossOver-CrossOver- 00:08:11.556 [2024-11-16 16:43:57.156973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.157000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.157061] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.157074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.157135] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.157148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.157208] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.157221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.556 #39 NEW cov: 11814 ft: 14491 corp: 27/730b lim: 35 exec/s: 39 rss: 68Mb L: 29/34 MS: 1 ChangeBinInt- 00:08:11.556 [2024-11-16 16:43:57.197074] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.197100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.197163] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006a7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.197177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.556 #40 NEW cov: 11814 ft: 14541 corp: 28/757b lim: 35 exec/s: 40 rss: 68Mb L: 27/34 MS: 1 InsertByte- 00:08:11.556 [2024-11-16 16:43:57.237272] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.237298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.237361] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.237374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.237436] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.556 [2024-11-16 16:43:57.237450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.556 [2024-11-16 16:43:57.237514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.557 [2024-11-16 16:43:57.237527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.557 #41 NEW cov: 11814 ft: 14609 corp: 29/791b lim: 35 exec/s: 41 rss: 68Mb L: 34/34 MS: 1 CMP- DE: "\377\377\377\013"- 00:08:11.557 [2024-11-16 16:43:57.277381] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.557 [2024-11-16 16:43:57.277407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.557 [2024-11-16 16:43:57.277470] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.557 [2024-11-16 16:43:57.277484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.557 [2024-11-16 16:43:57.277547] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 
cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.557 [2024-11-16 16:43:57.277561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.557 [2024-11-16 16:43:57.277622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.557 [2024-11-16 16:43:57.277636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.557 #42 NEW cov: 11814 ft: 14667 corp: 30/822b lim: 35 exec/s: 42 rss: 68Mb L: 31/34 MS: 1 CopyPart- 00:08:11.817 [2024-11-16 16:43:57.317516] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.317541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.817 [2024-11-16 16:43:57.317603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.317616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.817 [2024-11-16 16:43:57.317677] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000009e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.317691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.817 [2024-11-16 16:43:57.317755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000009e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.317769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.817 #43 NEW cov: 11814 ft: 14691 corp: 31/856b lim: 35 exec/s: 43 rss: 68Mb L: 34/34 MS: 1 CopyPart- 00:08:11.817 [2024-11-16 16:43:57.357605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.357630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.817 [2024-11-16 16:43:57.357690] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.357720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.817 [2024-11-16 16:43:57.357783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.357796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.817 [2024-11-16 16:43:57.357857] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.357872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 
dnr:0 00:08:11.817 #44 NEW cov: 11814 ft: 14704 corp: 32/884b lim: 35 exec/s: 44 rss: 68Mb L: 28/34 MS: 1 ChangeByte- 00:08:11.817 [2024-11-16 16:43:57.387757] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.387783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.817 [2024-11-16 16:43:57.387853] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.817 [2024-11-16 16:43:57.387868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.387929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.387943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.388005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.388018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.818 #45 NEW cov: 11814 ft: 14748 corp: 33/918b lim: 35 exec/s: 45 rss: 69Mb L: 34/34 MS: 1 ShuffleBytes- 00:08:11.818 [2024-11-16 16:43:57.427875] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.427901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.427966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.427981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.428048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.428062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.428122] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.428137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.818 #51 NEW cov: 11814 ft: 14808 corp: 34/946b lim: 35 exec/s: 51 rss: 69Mb L: 28/34 MS: 1 ChangeBinInt- 00:08:11.818 [2024-11-16 16:43:57.467943] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.467970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.468034] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.468049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.468110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.468124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.468185] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.468199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.818 #52 NEW cov: 11814 ft: 14854 corp: 35/975b lim: 35 exec/s: 52 rss: 69Mb L: 29/34 MS: 1 InsertByte- 00:08:11.818 [2024-11-16 16:43:57.508087] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.508116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.508176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.508190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.508251] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.508266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.508327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.508340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.818 #53 NEW cov: 11814 ft: 14859 corp: 36/1005b lim: 35 exec/s: 53 rss: 69Mb L: 30/34 MS: 1 ChangeByte- 00:08:11.818 [2024-11-16 16:43:57.548026] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.548052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.548112] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.548127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.818 [2024-11-16 16:43:57.548188] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.818 [2024-11-16 16:43:57.548203] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.078 #54 NEW cov: 11814 ft: 14896 corp: 37/1026b lim: 35 exec/s: 54 rss: 69Mb L: 21/34 MS: 1 ChangeBinInt- 00:08:12.078 [2024-11-16 16:43:57.588167] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000340 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.588193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.588256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.588270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.588331] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.588345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.078 #55 NEW cov: 11814 ft: 14912 corp: 38/1047b lim: 35 exec/s: 55 rss: 69Mb L: 21/34 MS: 1 ChangeBit- 00:08:12.078 [2024-11-16 16:43:57.628408] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.628434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.628495] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.628509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.628566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.628583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.628643] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.628657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.658504] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.658530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.658592] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.658606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.658665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.658683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.658746] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.658759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.078 #57 NEW cov: 11814 ft: 14913 corp: 39/1080b lim: 35 exec/s: 57 rss: 69Mb L: 33/34 MS: 2 InsertRepeatedBytes-ShuffleBytes- 00:08:12.078 [2024-11-16 16:43:57.688584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.688610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.688676] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.688690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.688752] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.688766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.688825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.688840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.078 #58 NEW cov: 11814 ft: 14916 corp: 40/1114b lim: 35 exec/s: 58 rss: 69Mb L: 34/34 MS: 1 ChangeByte- 00:08:12.078 [2024-11-16 16:43:57.728707] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000065 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.728733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.728795] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.728809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.728872] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000002f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.728889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.078 [2024-11-16 16:43:57.728950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.078 [2024-11-16 16:43:57.728964] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.078 #59 NEW cov: 11814 ft: 14922 corp: 41/1147b lim: 35 exec/s: 59 rss: 69Mb L: 33/34 MS: 1 ChangeBit- 00:08:12.078 [2024-11-16 16:43:57.768824] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.079 [2024-11-16 16:43:57.768849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.079 [2024-11-16 16:43:57.768910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.079 [2024-11-16 16:43:57.768924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.079 [2024-11-16 16:43:57.768984] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.079 [2024-11-16 16:43:57.768997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.079 [2024-11-16 16:43:57.769057] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.079 [2024-11-16 16:43:57.769070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.079 #60 NEW cov: 11814 ft: 14926 corp: 42/1180b lim: 35 exec/s: 60 rss: 69Mb L: 33/34 MS: 1 ChangeByte- 00:08:12.079 [2024-11-16 16:43:57.808971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.079 [2024-11-16 16:43:57.808996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.079 [2024-11-16 16:43:57.809056] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.079 [2024-11-16 16:43:57.809070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.079 [2024-11-16 16:43:57.809130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.079 [2024-11-16 16:43:57.809144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.079 [2024-11-16 16:43:57.809202] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.079 [2024-11-16 16:43:57.809216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.338 #61 NEW cov: 11814 ft: 14971 corp: 43/1214b lim: 35 exec/s: 61 rss: 69Mb L: 34/34 MS: 1 ChangeBit- 00:08:12.338 [2024-11-16 16:43:57.849047] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.338 [2024-11-16 16:43:57.849072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.338 
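These status lines are standard libFuzzer output; decoding #60 above as a worked example: "cov: 11814" is the count of covered coverage points so far, "ft: 14926" the number of distinct features seen, "corp: 42/1180b" a corpus of 42 units totalling 1180 bytes, "lim: 35" the current input-length cap, "exec/s: 60" the execution rate, "rss: 69Mb" resident memory, "L: 33/34" the new unit's length versus the largest unit in the corpus, and "MS: 1 ChangeByte-" the mutation sequence (here a single ChangeByte) that produced the new input.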
[2024-11-16 16:43:57.849134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000060 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.339 [2024-11-16 16:43:57.849148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.339 [2024-11-16 16:43:57.849213] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000002c0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.339 [2024-11-16 16:43:57.849227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.339 [2024-11-16 16:43:57.849288] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.339 [2024-11-16 16:43:57.849303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.339 #62 NEW cov: 11814 ft: 14982 corp: 44/1242b lim: 35 exec/s: 31 rss: 69Mb L: 28/34 MS: 1 CMP- DE: "\001\212\027(\361\275\300L"- 00:08:12.339 #62 DONE cov: 11814 ft: 14982 corp: 44/1242b lim: 35 exec/s: 31 rss: 69Mb 00:08:12.339 ###### Recommended dictionary. ###### 00:08:12.339 "\377\377\377\013" # Uses: 1 00:08:12.339 "\001\212\027(\361\275\300L" # Uses: 0 00:08:12.339 ###### End of recommended dictionary. ###### 00:08:12.339 Done 62 runs in 2 second(s) 00:08:12.339 16:43:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:08:12.339 16:43:57 -- ../common.sh@72 -- # (( i++ )) 00:08:12.339 16:43:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.339 16:43:57 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:12.339 16:43:57 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:12.339 16:43:57 -- nvmf/run.sh@24 -- # local timen=1 00:08:12.339 16:43:57 -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.339 16:43:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:12.339 16:43:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:12.339 16:43:57 -- nvmf/run.sh@29 -- # printf %02d 16 00:08:12.339 16:43:57 -- nvmf/run.sh@29 -- # port=4416 00:08:12.339 16:43:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:12.339 16:43:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:12.339 16:43:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.339 16:43:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:08:12.339 [2024-11-16 16:43:58.022613] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
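For orientation: the nvmf/run.sh trace interleaved above amounts to a small per-target setup that is repeated for every fuzzer index. A condensed sketch of those steps, assuming the traced flags are the full set (variable names below are reconstructions for readability, not the verbatim script):

    # derive a per-fuzzer TCP port: index 16 -> 4416, 17 -> 4417, ...
    N=16
    port="44$(printf '%02d' "$N")"
    conf="/tmp/fuzz_json_${N}.conf"
    corpus_dir="$corpus_root/llvm_nvmf_${N}"   # $corpus_root is assumed here
    mkdir -p "$corpus_dir"
    # point the target config at the chosen port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" fuzz_json.conf > "$conf"
    # run one libFuzzer-driven pass against the TCP transport for 1 second (-t 1)
    ./llvm_nvme_fuzz -m 0x1 -s 512 \
        -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
        -c "$conf" -t 1 -D "$corpus_dir" -Z "$N" -r "/var/tmp/spdk${N}.sock"

Each fuzzer index gets its own TCP port, JSON config, corpus directory, and RPC socket, so the passes can run back to back without colliding.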
00:08:12.339 [2024-11-16 16:43:58.022706] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid490179 ] 00:08:12.339 EAL: No free 2048 kB hugepages reported on node 1 00:08:12.598 [2024-11-16 16:43:58.199796] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.598 [2024-11-16 16:43:58.218876] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:12.598 [2024-11-16 16:43:58.218992] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.598 [2024-11-16 16:43:58.270280] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:12.598 [2024-11-16 16:43:58.286625] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:12.598 INFO: Running with entropic power schedule (0xFF, 100). 00:08:12.598 INFO: Seed: 3322541752 00:08:12.598 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:12.598 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:12.598 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:12.598 INFO: A corpus is not provided, starting from an empty corpus 00:08:12.598 #2 INITED exec/s: 0 rss: 59Mb 00:08:12.598 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:12.598 This may also happen if the target rejected all inputs we tried so far 00:08:12.857 [2024-11-16 16:43:58.352862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.857 [2024-11-16 16:43:58.352903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.857 [2024-11-16 16:43:58.353028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.857 [2024-11-16 16:43:58.353052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.857 [2024-11-16 16:43:58.353176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.857 [2024-11-16 16:43:58.353200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.857 [2024-11-16 16:43:58.353312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.857 [2024-11-16 16:43:58.353334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.118 NEW_FUNC[1/671]: 0x467ec8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:13.118 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.118 #6 NEW cov: 11662 ft: 11668 corp: 2/103b lim: 105 exec/s: 0 rss: 67Mb L: 
102/102 MS: 4 InsertByte-ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:13.118 [2024-11-16 16:43:58.684315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.684370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.684523] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.684553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.684709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.684737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.684887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.684914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.118 #7 NEW cov: 11780 ft: 12240 corp: 3/207b lim: 105 exec/s: 0 rss: 67Mb L: 104/104 MS: 1 CopyPart- 00:08:13.118 [2024-11-16 16:43:58.744594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.744631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.744743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.744768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.744914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:16212958658533785599 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.744939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.745079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.745101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.745238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.745260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 
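A note on reading the *NOTICE* pairs above: each command print is followed by its completion, and the "(00/0b)" tuple in the completion is the NVMe status code type / status code, here SCT 0x0 (generic) with SC 0x0b, i.e. INVALID NAMESPACE OR FORMAT (the "(00/02)" seen in the previous run is SC 0x02, INVALID FIELD). The remaining fields come straight from the 16-byte completion entry: qid is the queue the completion arrived on, cid the command identifier, cdw0 the command-specific result dword, sqhd the submission-queue head pointer, and p/m/dnr the phase-tag, More, and Do-Not-Retry status bits. dnr:1 on these I/O completions means the controller is telling the host not to retry the fuzzed command.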
00:08:13.118 #13 NEW cov: 11786 ft: 12587 corp: 4/312b lim: 105 exec/s: 0 rss: 67Mb L: 105/105 MS: 1 InsertByte- 00:08:13.118 [2024-11-16 16:43:58.804498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.804532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.804628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18387352853623603199 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.804650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.804804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.804839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.118 [2024-11-16 16:43:58.804987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.805011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.118 #19 NEW cov: 11871 ft: 12812 corp: 5/415b lim: 105 exec/s: 0 rss: 67Mb L: 103/105 MS: 1 InsertByte- 00:08:13.118 [2024-11-16 16:43:58.854029] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.118 [2024-11-16 16:43:58.854055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.378 #24 NEW cov: 11871 ft: 13581 corp: 6/443b lim: 105 exec/s: 0 rss: 67Mb L: 28/105 MS: 5 CMP-ChangeBinInt-EraseBytes-ChangeBit-CrossOver- DE: "\036\000"- 00:08:13.378 [2024-11-16 16:43:58.904922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:58.904954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.378 [2024-11-16 16:43:58.905043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2162009296114548735 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:58.905065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.378 [2024-11-16 16:43:58.905197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:58.905217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.378 [2024-11-16 16:43:58.905354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:58.905376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.378 #25 NEW cov: 11871 ft: 13667 corp: 7/546b lim: 105 exec/s: 0 rss: 67Mb L: 103/105 MS: 1 PersAutoDict- DE: "\036\000"- 00:08:13.378 [2024-11-16 16:43:58.954364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:58.954399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.378 #26 NEW cov: 11871 ft: 13708 corp: 8/576b lim: 105 exec/s: 0 rss: 67Mb L: 30/105 MS: 1 PersAutoDict- DE: "\036\000"- 00:08:13.378 [2024-11-16 16:43:59.014652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584322815 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:59.014681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.378 #27 NEW cov: 11871 ft: 13732 corp: 9/606b lim: 105 exec/s: 0 rss: 67Mb L: 30/105 MS: 1 PersAutoDict- DE: "\036\000"- 00:08:13.378 [2024-11-16 16:43:59.064752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:72057589927509790 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:59.064778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.378 #28 NEW cov: 11871 ft: 13787 corp: 10/634b lim: 105 exec/s: 0 rss: 67Mb L: 28/105 MS: 1 PersAutoDict- DE: "\036\000"- 00:08:13.378 [2024-11-16 16:43:59.115527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:59.115557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.378 [2024-11-16 16:43:59.115645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18387352853623603199 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:59.115667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.378 [2024-11-16 16:43:59.115830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073708503039 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:59.115852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.378 [2024-11-16 16:43:59.116001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.378 [2024-11-16 16:43:59.116027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.642 #29 NEW cov: 11871 ft: 13824 corp: 11/737b lim: 105 exec/s: 0 rss: 67Mb L: 103/105 MS: 1 ChangeBit- 00:08:13.642 [2024-11-16 16:43:59.165298] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.642 [2024-11-16 16:43:59.165332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.642 [2024-11-16 16:43:59.165487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18387352853623603199 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.642 [2024-11-16 16:43:59.165507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.642 #30 NEW cov: 11871 ft: 14163 corp: 12/797b lim: 105 exec/s: 0 rss: 67Mb L: 60/105 MS: 1 EraseBytes- 00:08:13.642 [2024-11-16 16:43:59.225319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1945555034913636126 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.642 [2024-11-16 16:43:59.225344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.642 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:13.642 #31 NEW cov: 11894 ft: 14220 corp: 13/825b lim: 105 exec/s: 0 rss: 68Mb L: 28/105 MS: 1 ChangeByte- 00:08:13.642 [2024-11-16 16:43:59.285561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.642 [2024-11-16 16:43:59.285590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.642 #32 NEW cov: 11894 ft: 14245 corp: 14/858b lim: 105 exec/s: 0 rss: 68Mb L: 33/105 MS: 1 EraseBytes- 00:08:13.642 [2024-11-16 16:43:59.345870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584322815 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.642 [2024-11-16 16:43:59.345902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.642 [2024-11-16 16:43:59.346039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.642 [2024-11-16 16:43:59.346060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.642 #33 NEW cov: 11894 ft: 14265 corp: 15/912b lim: 105 exec/s: 33 rss: 68Mb L: 54/105 MS: 1 CopyPart- 00:08:13.900 [2024-11-16 16:43:59.406163] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584322815 len:64512 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.900 [2024-11-16 16:43:59.406198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.900 [2024-11-16 16:43:59.406334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.900 [2024-11-16 16:43:59.406359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.900 #34 NEW cov: 11894 
ft: 14291 corp: 16/966b lim: 105 exec/s: 34 rss: 68Mb L: 54/105 MS: 1 ChangeBit- 00:08:13.900 [2024-11-16 16:43:59.455957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584322815 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.900 [2024-11-16 16:43:59.455990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.900 #35 NEW cov: 11894 ft: 14300 corp: 17/997b lim: 105 exec/s: 35 rss: 68Mb L: 31/105 MS: 1 InsertByte- 00:08:13.900 [2024-11-16 16:43:59.506139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.901 [2024-11-16 16:43:59.506171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.901 #36 NEW cov: 11894 ft: 14305 corp: 18/1025b lim: 105 exec/s: 36 rss: 68Mb L: 28/105 MS: 1 ShuffleBytes- 00:08:13.901 [2024-11-16 16:43:59.557039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.901 [2024-11-16 16:43:59.557070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.901 [2024-11-16 16:43:59.557173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2161761905998299135 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.901 [2024-11-16 16:43:59.557200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.901 [2024-11-16 16:43:59.557340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.901 [2024-11-16 16:43:59.557366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.901 [2024-11-16 16:43:59.557498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.901 [2024-11-16 16:43:59.557519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.901 #37 NEW cov: 11894 ft: 14314 corp: 19/1128b lim: 105 exec/s: 37 rss: 68Mb L: 103/105 MS: 1 ChangeByte- 00:08:13.901 [2024-11-16 16:43:59.606937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133470 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.901 [2024-11-16 16:43:59.606972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.901 [2024-11-16 16:43:59.607099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.901 [2024-11-16 16:43:59.607123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.901 [2024-11-16 16:43:59.607265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 
lba:18446744073709492991 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.901 [2024-11-16 16:43:59.607291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.901 #38 NEW cov: 11894 ft: 14604 corp: 20/1196b lim: 105 exec/s: 38 rss: 68Mb L: 68/105 MS: 1 InsertRepeatedBytes- 00:08:14.160 [2024-11-16 16:43:59.666771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1945555034913636126 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.160 [2024-11-16 16:43:59.666803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.160 #39 NEW cov: 11894 ft: 14623 corp: 21/1224b lim: 105 exec/s: 39 rss: 68Mb L: 28/105 MS: 1 ShuffleBytes- 00:08:14.160 [2024-11-16 16:43:59.716893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.160 [2024-11-16 16:43:59.716921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.160 #40 NEW cov: 11894 ft: 14685 corp: 22/1252b lim: 105 exec/s: 40 rss: 68Mb L: 28/105 MS: 1 ChangeByte- 00:08:14.160 [2024-11-16 16:43:59.767229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584322815 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.160 [2024-11-16 16:43:59.767261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.160 [2024-11-16 16:43:59.767404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.160 [2024-11-16 16:43:59.767430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.160 #41 NEW cov: 11894 ft: 14702 corp: 23/1306b lim: 105 exec/s: 41 rss: 68Mb L: 54/105 MS: 1 ChangeBinInt- 00:08:14.160 [2024-11-16 16:43:59.817191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65311 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.160 [2024-11-16 16:43:59.817237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.160 #42 NEW cov: 11894 ft: 14711 corp: 24/1334b lim: 105 exec/s: 42 rss: 68Mb L: 28/105 MS: 1 PersAutoDict- DE: "\036\000"- 00:08:14.160 [2024-11-16 16:43:59.877853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65311 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.160 [2024-11-16 16:43:59.877888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.160 [2024-11-16 16:43:59.877980] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.160 [2024-11-16 16:43:59.878002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.160 [2024-11-16 16:43:59.878147] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.160 [2024-11-16 16:43:59.878171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.160 #43 NEW cov: 11894 ft: 14732 corp: 25/1400b lim: 105 exec/s: 43 rss: 68Mb L: 66/105 MS: 1 CrossOver- 00:08:14.420 [2024-11-16 16:43:59.938261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:43:59.938300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:43:59.938401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:43:59.938426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:43:59.938561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:57344 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:43:59.938586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:43:59.938736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:43:59.938762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.420 #44 NEW cov: 11894 ft: 14783 corp: 26/1502b lim: 105 exec/s: 44 rss: 68Mb L: 102/105 MS: 1 ChangeBit- 00:08:14.420 [2024-11-16 16:43:59.987823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069599133695 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:43:59.987850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.420 #45 NEW cov: 11894 ft: 14784 corp: 27/1531b lim: 105 exec/s: 45 rss: 68Mb L: 29/105 MS: 1 InsertByte- 00:08:14.420 [2024-11-16 16:44:00.038872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.038906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:44:00.039014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2161761905998299135 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.039031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:44:00.039164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.039182] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:44:00.039303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.039331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:44:00.039487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.039512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.420 #46 NEW cov: 11894 ft: 14801 corp: 28/1636b lim: 105 exec/s: 46 rss: 68Mb L: 105/105 MS: 1 CrossOver- 00:08:14.420 [2024-11-16 16:44:00.098147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069584322815 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.098183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.420 #47 NEW cov: 11894 ft: 14805 corp: 29/1666b lim: 105 exec/s: 47 rss: 68Mb L: 30/105 MS: 1 ShuffleBytes- 00:08:14.420 [2024-11-16 16:44:00.149001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.149037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:44:00.149115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.149139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:44:00.149273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.149296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.420 [2024-11-16 16:44:00.149435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.420 [2024-11-16 16:44:00.149461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.680 #48 NEW cov: 11894 ft: 14851 corp: 30/1768b lim: 105 exec/s: 48 rss: 68Mb L: 102/105 MS: 1 ShuffleBytes- 00:08:14.680 [2024-11-16 16:44:00.199518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709522175 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.199552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.680 [2024-11-16 16:44:00.199646] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2161761905998299135 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.199675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.680 [2024-11-16 16:44:00.199815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.199837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.680 [2024-11-16 16:44:00.199982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.200004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.680 [2024-11-16 16:44:00.200145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.200169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.680 #49 NEW cov: 11894 ft: 14857 corp: 31/1873b lim: 105 exec/s: 49 rss: 69Mb L: 105/105 MS: 1 ChangeByte- 00:08:14.680 [2024-11-16 16:44:00.259373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.259407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.680 [2024-11-16 16:44:00.259530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.259554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.680 [2024-11-16 16:44:00.259688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.259738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.680 [2024-11-16 16:44:00.259888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.259915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.680 #50 NEW cov: 11894 ft: 14883 corp: 32/1977b lim: 105 exec/s: 50 rss: 69Mb L: 104/105 MS: 1 PersAutoDict- DE: "\036\000"- 00:08:14.680 [2024-11-16 16:44:00.308911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446495579971256319 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.680 [2024-11-16 16:44:00.308944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:14.680 #51 NEW cov: 11894 ft: 14890 corp: 33/2006b lim: 105 exec/s: 25 rss: 69Mb L: 29/105 MS: 1 ChangeBinInt- 00:08:14.680 #51 DONE cov: 11894 ft: 14890 corp: 33/2006b lim: 105 exec/s: 25 rss: 69Mb 00:08:14.680 ###### Recommended dictionary. ###### 00:08:14.680 "\036\000" # Uses: 6 00:08:14.680 ###### End of recommended dictionary. ###### 00:08:14.680 Done 51 runs in 2 second(s) 00:08:14.940 16:44:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:14.940 16:44:00 -- ../common.sh@72 -- # (( i++ )) 00:08:14.940 16:44:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:14.940 16:44:00 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:14.940 16:44:00 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:14.940 16:44:00 -- nvmf/run.sh@24 -- # local timen=1 00:08:14.940 16:44:00 -- nvmf/run.sh@25 -- # local core=0x1 00:08:14.940 16:44:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:14.940 16:44:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:14.940 16:44:00 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:14.940 16:44:00 -- nvmf/run.sh@29 -- # port=4417 00:08:14.940 16:44:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:14.940 16:44:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:14.940 16:44:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:14.940 16:44:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:14.940 [2024-11-16 16:44:00.493944] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:14.940 [2024-11-16 16:44:00.494010] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid490716 ] 00:08:14.940 EAL: No free 2048 kB hugepages reported on node 1 00:08:14.940 [2024-11-16 16:44:00.674944] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.200 [2024-11-16 16:44:00.694877] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:15.200 [2024-11-16 16:44:00.694998] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.200 [2024-11-16 16:44:00.746604] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.200 [2024-11-16 16:44:00.762881] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:15.200 INFO: Running with entropic power schedule (0xFF, 100). 
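What follows is the rest of libFuzzer's startup banner for this pass: the Seed value feeds the mutation PRNG, the 344599 "inline 8-bit counters" and PC tables are the SanitizerCoverage instrumentation from which the cov:/ft: numbers are computed, and because 0 files are found in llvm_nvmf_17's corpus directory the run starts from an empty corpus, which is also why these short one-second runs can end with the "no interesting inputs" warning.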
00:08:15.200 INFO: Seed: 1504570900 00:08:15.200 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:15.200 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:15.200 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:15.200 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.200 #2 INITED exec/s: 0 rss: 60Mb 00:08:15.200 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:15.200 This may also happen if the target rejected all inputs we tried so far 00:08:15.200 [2024-11-16 16:44:00.818006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070136004607 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.200 [2024-11-16 16:44:00.818035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.459 NEW_FUNC[1/672]: 0x46b1b8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:15.459 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:15.459 #14 NEW cov: 11688 ft: 11657 corp: 2/39b lim: 120 exec/s: 0 rss: 67Mb L: 38/38 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:15.459 [2024-11-16 16:44:01.119003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18386789900096634879 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.459 [2024-11-16 16:44:01.119037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.459 [2024-11-16 16:44:01.119102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.459 [2024-11-16 16:44:01.119120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.459 #15 NEW cov: 11801 ft: 12974 corp: 3/90b lim: 120 exec/s: 0 rss: 67Mb L: 51/51 MS: 1 CopyPart- 00:08:15.459 [2024-11-16 16:44:01.168930] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18386789900096634879 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.459 [2024-11-16 16:44:01.168961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.460 #16 NEW cov: 11807 ft: 13245 corp: 4/136b lim: 120 exec/s: 0 rss: 67Mb L: 46/51 MS: 1 EraseBytes- 00:08:15.719 [2024-11-16 16:44:01.209133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743155307970559 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.719 [2024-11-16 16:44:01.209168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.719 [2024-11-16 16:44:01.209225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.719 [2024-11-16 16:44:01.209242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
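Aside: when skimming a log like this for coverage growth, filtering the status lines is enough. One possible one-liner, assuming the console output has been saved to run17.log (a hypothetical filename):

    grep -o '#[0-9]\+ NEW cov: [0-9]\+' run17.log | tail -n 5

which prints the last few "#N NEW cov: M" pairs, enough to see whether coverage is still climbing when the run ends.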
00:08:15.719 #22 NEW cov: 11892 ft: 13526 corp: 5/187b lim: 120 exec/s: 0 rss: 67Mb L: 51/51 MS: 1 ShuffleBytes- 00:08:15.719 [2024-11-16 16:44:01.249146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18386789900096634879 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.719 [2024-11-16 16:44:01.249175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.719 #23 NEW cov: 11892 ft: 13591 corp: 6/233b lim: 120 exec/s: 0 rss: 67Mb L: 46/51 MS: 1 ChangeBinInt- 00:08:15.719 [2024-11-16 16:44:01.289578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.719 [2024-11-16 16:44:01.289608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.719 [2024-11-16 16:44:01.289651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.719 [2024-11-16 16:44:01.289667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.719 [2024-11-16 16:44:01.289729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.719 [2024-11-16 16:44:01.289745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.719 #24 NEW cov: 11892 ft: 13996 corp: 7/325b lim: 120 exec/s: 0 rss: 67Mb L: 92/92 MS: 1 InsertRepeatedBytes- 00:08:15.719 [2024-11-16 16:44:01.329365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18386789900096634879 len:12032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.719 [2024-11-16 16:44:01.329393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.719 #25 NEW cov: 11892 ft: 14044 corp: 8/371b lim: 120 exec/s: 0 rss: 68Mb L: 46/92 MS: 1 ShuffleBytes- 00:08:15.719 [2024-11-16 16:44:01.369852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.719 [2024-11-16 16:44:01.369880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.720 [2024-11-16 16:44:01.369919] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.720 [2024-11-16 16:44:01.369936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.720 [2024-11-16 16:44:01.369996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.720 [2024-11-16 16:44:01.370011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.720 #26 NEW cov: 11892 ft: 14078 corp: 9/463b lim: 120 exec/s: 0 rss: 68Mb L: 92/92 MS: 1 
ChangeByte- 00:08:15.720 [2024-11-16 16:44:01.420121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.720 [2024-11-16 16:44:01.420153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.720 [2024-11-16 16:44:01.420193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.720 [2024-11-16 16:44:01.420209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.720 [2024-11-16 16:44:01.420262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.720 [2024-11-16 16:44:01.420279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.720 [2024-11-16 16:44:01.420332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:81 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.720 [2024-11-16 16:44:01.420347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.720 #27 NEW cov: 11892 ft: 14490 corp: 10/567b lim: 120 exec/s: 0 rss: 68Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:08:15.720 [2024-11-16 16:44:01.459723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18387635424538394623 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.720 [2024-11-16 16:44:01.459751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.979 #28 NEW cov: 11892 ft: 14574 corp: 11/613b lim: 120 exec/s: 0 rss: 68Mb L: 46/104 MS: 1 ChangeBinInt- 00:08:15.979 [2024-11-16 16:44:01.500071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446743155307970559 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.979 [2024-11-16 16:44:01.500101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.979 [2024-11-16 16:44:01.500141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.979 [2024-11-16 16:44:01.500158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.979 #29 NEW cov: 11892 ft: 14635 corp: 12/664b lim: 120 exec/s: 0 rss: 68Mb L: 51/104 MS: 1 ShuffleBytes- 00:08:15.979 [2024-11-16 16:44:01.539969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446509874145198079 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.979 [2024-11-16 16:44:01.539999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.979 #30 NEW cov: 11892 ft: 14724 corp: 13/711b lim: 120 exec/s: 0 rss: 68Mb L: 47/104 MS: 1 InsertByte- 00:08:15.979 [2024-11-16 16:44:01.580273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.979 [2024-11-16 16:44:01.580300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.979 [2024-11-16 16:44:01.580337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.979 [2024-11-16 16:44:01.580354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.979 #31 NEW cov: 11892 ft: 14793 corp: 14/760b lim: 120 exec/s: 0 rss: 68Mb L: 49/104 MS: 1 EraseBytes- 00:08:15.979 [2024-11-16 16:44:01.630237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446509874145198079 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.979 [2024-11-16 16:44:01.630265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.979 #32 NEW cov: 11892 ft: 14875 corp: 15/793b lim: 120 exec/s: 0 rss: 68Mb L: 33/104 MS: 1 CrossOver- 00:08:15.979 [2024-11-16 16:44:01.670299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18386789900096634879 len:12032 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.979 [2024-11-16 16:44:01.670329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.979 #33 NEW cov: 11892 ft: 14895 corp: 16/839b lim: 120 exec/s: 0 rss: 68Mb L: 46/104 MS: 1 CopyPart- 00:08:15.979 [2024-11-16 16:44:01.710574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.979 [2024-11-16 16:44:01.710602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.979 [2024-11-16 16:44:01.710639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.979 [2024-11-16 16:44:01.710655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.238 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.239 #34 NEW cov: 11915 ft: 15020 corp: 17/888b lim: 120 exec/s: 0 rss: 68Mb L: 49/104 MS: 1 ShuffleBytes- 00:08:16.239 [2024-11-16 16:44:01.760586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18386789900096634879 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.760615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.239 #35 NEW cov: 11915 ft: 15043 corp: 18/934b lim: 120 exec/s: 0 rss: 68Mb L: 46/104 MS: 1 ChangeByte- 00:08:16.239 [2024-11-16 16:44:01.800838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4074504960 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.800866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:16.239 [2024-11-16 16:44:01.800901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.800918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.239 #40 NEW cov: 11915 ft: 15083 corp: 19/983b lim: 120 exec/s: 40 rss: 68Mb L: 49/104 MS: 5 ChangeByte-ChangeByte-ChangeByte-CMP-InsertRepeatedBytes- DE: "\334\003\000\000\000\000\000\000"- 00:08:16.239 [2024-11-16 16:44:01.840835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446509874145198079 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.840864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.239 #41 NEW cov: 11915 ft: 15094 corp: 20/1016b lim: 120 exec/s: 41 rss: 68Mb L: 33/104 MS: 1 ChangeByte- 00:08:16.239 [2024-11-16 16:44:01.881275] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18386789900096634879 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.881304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.239 [2024-11-16 16:44:01.881351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.881367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.239 [2024-11-16 16:44:01.881424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446742974197923840 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.881444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.239 #42 NEW cov: 11915 ft: 15101 corp: 21/1111b lim: 120 exec/s: 42 rss: 68Mb L: 95/104 MS: 1 CrossOver- 00:08:16.239 [2024-11-16 16:44:01.921541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.921571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.239 [2024-11-16 16:44:01.921623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.921639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.239 [2024-11-16 16:44:01.921700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.921716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.239 [2024-11-16 16:44:01.921775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:81 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.921791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.239 #43 NEW cov: 11915 ft: 15134 corp: 22/1215b lim: 120 exec/s: 43 rss: 68Mb L: 104/104 MS: 1 ShuffleBytes- 00:08:16.239 [2024-11-16 16:44:01.971225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:3098242344080637951 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.239 [2024-11-16 16:44:01.971254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.498 #44 NEW cov: 11915 ft: 15164 corp: 23/1262b lim: 120 exec/s: 44 rss: 69Mb L: 47/104 MS: 1 InsertByte- 00:08:16.498 [2024-11-16 16:44:02.011342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446509874145198079 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.011370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.498 #45 NEW cov: 11915 ft: 15246 corp: 24/1295b lim: 120 exec/s: 45 rss: 69Mb L: 33/104 MS: 1 ChangeBinInt- 00:08:16.498 [2024-11-16 16:44:02.051472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446509874145198079 len:65327 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.051501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.498 #46 NEW cov: 11915 ft: 15270 corp: 25/1328b lim: 120 exec/s: 46 rss: 69Mb L: 33/104 MS: 1 ChangeBinInt- 00:08:16.498 [2024-11-16 16:44:02.092095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.092124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.498 [2024-11-16 16:44:02.092162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.092178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.498 [2024-11-16 16:44:02.092233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.092250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.498 [2024-11-16 16:44:02.092310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:81 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.092326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.498 #47 NEW cov: 11915 ft: 15340 corp: 26/1432b lim: 120 exec/s: 47 rss: 69Mb L: 104/104 MS: 1 ChangeByte- 00:08:16.498 [2024-11-16 16:44:02.141729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4634292322439286864 len:20561 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.141758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.498 #48 NEW cov: 11915 ft: 15359 corp: 27/1463b lim: 120 exec/s: 48 rss: 69Mb L: 31/104 MS: 1 EraseBytes- 00:08:16.498 [2024-11-16 16:44:02.181966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18386789900096634879 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.181993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.498 [2024-11-16 16:44:02.182028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.182044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.498 #49 NEW cov: 11915 ft: 15369 corp: 28/1521b lim: 120 exec/s: 49 rss: 69Mb L: 58/104 MS: 1 InsertRepeatedBytes- 00:08:16.498 [2024-11-16 16:44:02.222380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.222409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.498 [2024-11-16 16:44:02.222459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.222476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.498 [2024-11-16 16:44:02.222530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.222546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.498 [2024-11-16 16:44:02.222607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:81 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.498 [2024-11-16 16:44:02.222622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.498 #50 NEW cov: 11915 ft: 15382 corp: 29/1625b lim: 120 exec/s: 50 rss: 69Mb L: 104/104 MS: 1 ChangeBit- 00:08:16.758 [2024-11-16 16:44:02.261993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4634292322439286864 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.262020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.758 #51 NEW cov: 11915 ft: 15396 corp: 30/1657b lim: 120 exec/s: 51 rss: 69Mb L: 32/104 MS: 1 InsertByte- 00:08:16.758 [2024-11-16 16:44:02.302318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.302348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.758 [2024-11-16 16:44:02.302392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787125522518528080 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.302410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.758 #52 NEW cov: 11915 ft: 15406 corp: 31/1720b lim: 120 exec/s: 52 rss: 69Mb L: 63/104 MS: 1 EraseBytes- 00:08:16.758 [2024-11-16 16:44:02.342401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4074504960 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.342429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.758 [2024-11-16 16:44:02.342469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:9223372036854775808 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.342484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.758 #53 NEW cov: 11915 ft: 15431 corp: 32/1769b lim: 120 exec/s: 53 rss: 69Mb L: 49/104 MS: 1 ChangeByte- 00:08:16.758 [2024-11-16 16:44:02.382402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446509874145198079 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.382429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.758 #54 NEW cov: 11915 ft: 15434 corp: 33/1802b lim: 120 exec/s: 54 rss: 69Mb L: 33/104 MS: 1 CopyPart- 00:08:16.758 [2024-11-16 16:44:02.423076] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.423104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.758 [2024-11-16 16:44:02.423143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.423159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.758 [2024-11-16 16:44:02.423212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.423229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.758 [2024-11-16 16:44:02.423302] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:81 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.423319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.758 #55 NEW cov: 11915 ft: 15448 corp: 34/1906b lim: 120 exec/s: 55 rss: 69Mb L: 104/104 MS: 1 ShuffleBytes- 00:08:16.758 [2024-11-16 16:44:02.463090] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18386789900096634879 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.463119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.758 [2024-11-16 16:44:02.463170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.463187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.758 [2024-11-16 16:44:02.463238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:393216 len:65281 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.463253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.758 [2024-11-16 16:44:02.463312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:281474976645120 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.758 [2024-11-16 16:44:02.463325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.758 #56 NEW cov: 11915 ft: 15451 corp: 35/2005b lim: 120 exec/s: 56 rss: 69Mb L: 99/104 MS: 1 CMP- DE: "\006\000\000\000"- 00:08:17.018 [2024-11-16 16:44:02.513083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.018 [2024-11-16 16:44:02.513110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.019 [2024-11-16 16:44:02.513150] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.513164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.019 [2024-11-16 16:44:02.513220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.513236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.019 #57 NEW cov: 11915 ft: 15458 corp: 36/2097b lim: 120 exec/s: 57 rss: 69Mb L: 92/104 MS: 1 ShuffleBytes- 00:08:17.019 [2024-11-16 16:44:02.553316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.553343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.019 [2024-11-16 16:44:02.553397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.553413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.019 
[2024-11-16 16:44:02.553470] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:5787213827046133840 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.553485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.019 [2024-11-16 16:44:02.553542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:81 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.553558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.019 #58 NEW cov: 11915 ft: 15470 corp: 37/2209b lim: 120 exec/s: 58 rss: 69Mb L: 112/112 MS: 1 PersAutoDict- DE: "\334\003\000\000\000\000\000\000"- 00:08:17.019 [2024-11-16 16:44:02.603094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446509874145198079 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.603122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.019 #59 NEW cov: 11915 ft: 15482 corp: 38/2242b lim: 120 exec/s: 59 rss: 69Mb L: 33/112 MS: 1 ChangeByte- 00:08:17.019 [2024-11-16 16:44:02.643147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4634292322439286864 len:20561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.643176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.019 #60 NEW cov: 11915 ft: 15486 corp: 39/2273b lim: 120 exec/s: 60 rss: 69Mb L: 31/112 MS: 1 CopyPart- 00:08:17.019 [2024-11-16 16:44:02.683258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744070136004607 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.683287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.019 #61 NEW cov: 11915 ft: 15521 corp: 40/2312b lim: 120 exec/s: 61 rss: 69Mb L: 39/112 MS: 1 InsertByte- 00:08:17.019 [2024-11-16 16:44:02.723875] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.723904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.019 [2024-11-16 16:44:02.723951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.723967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.019 [2024-11-16 16:44:02.724020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.724035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.019 [2024-11-16 16:44:02.724089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE sqid:1 cid:3 nsid:0 lba:7161677110969590627 len:25444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.724104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.019 #65 NEW cov: 11915 ft: 15524 corp: 41/2429b lim: 120 exec/s: 65 rss: 69Mb L: 117/117 MS: 4 InsertByte-CopyPart-EraseBytes-InsertRepeatedBytes- 00:08:17.019 [2024-11-16 16:44:02.763682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446509874145198079 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.763709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.019 [2024-11-16 16:44:02.763746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.019 [2024-11-16 16:44:02.763777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.278 #67 NEW cov: 11915 ft: 15533 corp: 42/2495b lim: 120 exec/s: 67 rss: 69Mb L: 66/117 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:17.279 [2024-11-16 16:44:02.804091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:7161677110969590627 len:25444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.279 [2024-11-16 16:44:02.804121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.279 [2024-11-16 16:44:02.804167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:7161677110969590627 len:25444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.279 [2024-11-16 16:44:02.804182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.279 [2024-11-16 16:44:02.804235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:7161677110969590627 len:25444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.279 [2024-11-16 16:44:02.804252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.279 [2024-11-16 16:44:02.804305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:7161677110969590627 len:25444 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.279 [2024-11-16 16:44:02.804321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.279 #68 NEW cov: 11915 ft: 15540 corp: 43/2614b lim: 120 exec/s: 34 rss: 69Mb L: 119/119 MS: 1 CrossOver- 00:08:17.279 #68 DONE cov: 11915 ft: 15540 corp: 43/2614b lim: 120 exec/s: 34 rss: 69Mb 00:08:17.279 ###### Recommended dictionary. ###### 00:08:17.279 "\334\003\000\000\000\000\000\000" # Uses: 1 00:08:17.279 "\006\000\000\000" # Uses: 0 00:08:17.279 ###### End of recommended dictionary. 
###### 00:08:17.279 Done 68 runs in 2 second(s) 00:08:17.279 16:44:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:17.279 16:44:02 -- ../common.sh@72 -- # (( i++ )) 00:08:17.279 16:44:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.279 16:44:02 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:17.279 16:44:02 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:17.279 16:44:02 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.279 16:44:02 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.279 16:44:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:17.279 16:44:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:17.279 16:44:02 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:17.279 16:44:02 -- nvmf/run.sh@29 -- # port=4418 00:08:17.279 16:44:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:17.279 16:44:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:17.279 16:44:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:17.279 16:44:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:17.279 [2024-11-16 16:44:02.990204] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:17.279 [2024-11-16 16:44:02.990295] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid491104 ] 00:08:17.279 EAL: No free 2048 kB hugepages reported on node 1 00:08:17.539 [2024-11-16 16:44:03.176647] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.539 [2024-11-16 16:44:03.195866] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:17.539 [2024-11-16 16:44:03.195980] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.539 [2024-11-16 16:44:03.247301] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:17.539 [2024-11-16 16:44:03.263632] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:17.539 INFO: Running with entropic power schedule (0xFF, 100). 00:08:17.539 INFO: Seed: 4004573415 00:08:17.798 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:17.798 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:17.798 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:17.798 INFO: A corpus is not provided, starting from an empty corpus 00:08:17.798 #2 INITED exec/s: 0 rss: 59Mb 00:08:17.798 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:17.798 This may also happen if the target rejected all inputs we tried so far 00:08:17.798 [2024-11-16 16:44:03.329777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:17.798 [2024-11-16 16:44:03.329819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.798 [2024-11-16 16:44:03.329928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:17.798 [2024-11-16 16:44:03.329949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.798 [2024-11-16 16:44:03.330062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:17.798 [2024-11-16 16:44:03.330082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.798 [2024-11-16 16:44:03.330191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:17.798 [2024-11-16 16:44:03.330212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.059 NEW_FUNC[1/670]: 0x46ea18 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:18.059 NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.059 #8 NEW cov: 11619 ft: 11630 corp: 2/97b lim: 100 exec/s: 0 rss: 67Mb L: 96/96 MS: 1 InsertRepeatedBytes- 00:08:18.059 [2024-11-16 16:44:03.640510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.059 [2024-11-16 16:44:03.640548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.640677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.059 [2024-11-16 16:44:03.640696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.640802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.059 [2024-11-16 16:44:03.640822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.640934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.059 [2024-11-16 16:44:03.640955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.059 #9 NEW cov: 11745 ft: 12158 corp: 3/196b lim: 100 exec/s: 0 rss: 67Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:18.059 [2024-11-16 16:44:03.690274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.059 [2024-11-16 16:44:03.690307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.690434] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.059 [2024-11-16 16:44:03.690454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.059 #12 NEW cov: 11751 ft: 12688 corp: 4/238b lim: 100 exec/s: 0 rss: 67Mb L: 42/99 MS: 3 ChangeBit-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:18.059 [2024-11-16 16:44:03.730685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.059 [2024-11-16 16:44:03.730721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.730826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.059 [2024-11-16 16:44:03.730848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.730960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.059 [2024-11-16 16:44:03.730981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.731097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.059 [2024-11-16 16:44:03.731115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.059 #13 NEW cov: 11836 ft: 12950 corp: 5/335b lim: 100 exec/s: 0 rss: 67Mb L: 97/99 MS: 1 InsertByte- 00:08:18.059 [2024-11-16 16:44:03.770886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.059 [2024-11-16 16:44:03.770915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.771034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.059 [2024-11-16 16:44:03.771057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.771168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.059 [2024-11-16 16:44:03.771188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.059 [2024-11-16 16:44:03.771304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.059 [2024-11-16 16:44:03.771328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.059 #14 NEW cov: 11836 ft: 13017 corp: 6/432b lim: 100 exec/s: 0 rss: 67Mb L: 97/99 MS: 1 ChangeBinInt- 00:08:18.319 [2024-11-16 16:44:03.821105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.319 [2024-11-16 16:44:03.821135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.319 [2024-11-16 16:44:03.821221] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.319 [2024-11-16 16:44:03.821238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.319 [2024-11-16 16:44:03.821358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.319 [2024-11-16 16:44:03.821381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.319 [2024-11-16 16:44:03.821491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.319 [2024-11-16 16:44:03.821511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.319 #15 NEW cov: 11836 ft: 13240 corp: 7/529b lim: 100 exec/s: 0 rss: 67Mb L: 97/99 MS: 1 ChangeBit- 00:08:18.319 [2024-11-16 16:44:03.861187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.319 [2024-11-16 16:44:03.861219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.319 [2024-11-16 16:44:03.861328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.319 [2024-11-16 16:44:03.861346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.319 [2024-11-16 16:44:03.861461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.319 [2024-11-16 16:44:03.861483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.319 [2024-11-16 16:44:03.861597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.319 [2024-11-16 16:44:03.861617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.319 #16 NEW cov: 11836 ft: 13283 corp: 8/626b lim: 100 exec/s: 0 rss: 67Mb L: 97/99 MS: 1 CrossOver- 00:08:18.319 [2024-11-16 16:44:03.901225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.319 [2024-11-16 16:44:03.901255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:03.901397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.320 [2024-11-16 16:44:03.901414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:03.901530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.320 [2024-11-16 16:44:03.901550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:03.901673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.320 [2024-11-16 16:44:03.901690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 
cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.320 #22 NEW cov: 11836 ft: 13307 corp: 9/725b lim: 100 exec/s: 0 rss: 67Mb L: 99/99 MS: 1 CMP- DE: "\001\000"- 00:08:18.320 [2024-11-16 16:44:03.941409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.320 [2024-11-16 16:44:03.941439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:03.941551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.320 [2024-11-16 16:44:03.941572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:03.941708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.320 [2024-11-16 16:44:03.941728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:03.941843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.320 [2024-11-16 16:44:03.941864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.320 #23 NEW cov: 11836 ft: 13335 corp: 10/824b lim: 100 exec/s: 0 rss: 67Mb L: 99/99 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:18.320 [2024-11-16 16:44:03.981119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.320 [2024-11-16 16:44:03.981149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:03.981256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.320 [2024-11-16 16:44:03.981277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.320 #24 NEW cov: 11836 ft: 13445 corp: 11/866b lim: 100 exec/s: 0 rss: 67Mb L: 42/99 MS: 1 CopyPart- 00:08:18.320 [2024-11-16 16:44:04.021664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.320 [2024-11-16 16:44:04.021701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:04.021825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.320 [2024-11-16 16:44:04.021847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:04.021968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.320 [2024-11-16 16:44:04.021988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:04.022119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.320 [2024-11-16 16:44:04.022140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.320 #25 NEW 
cov: 11836 ft: 13461 corp: 12/954b lim: 100 exec/s: 0 rss: 67Mb L: 88/99 MS: 1 EraseBytes- 00:08:18.320 [2024-11-16 16:44:04.061763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.320 [2024-11-16 16:44:04.061791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:04.061853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.320 [2024-11-16 16:44:04.061875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:04.061986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.320 [2024-11-16 16:44:04.062005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.320 [2024-11-16 16:44:04.062116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.320 [2024-11-16 16:44:04.062135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.580 #26 NEW cov: 11836 ft: 13566 corp: 13/1053b lim: 100 exec/s: 0 rss: 67Mb L: 99/99 MS: 1 ChangeBit- 00:08:18.580 [2024-11-16 16:44:04.101909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.580 [2024-11-16 16:44:04.101939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.102050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.580 [2024-11-16 16:44:04.102070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.102187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.580 [2024-11-16 16:44:04.102208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.102321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.580 [2024-11-16 16:44:04.102342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.580 #27 NEW cov: 11836 ft: 13603 corp: 14/1152b lim: 100 exec/s: 0 rss: 68Mb L: 99/99 MS: 1 ChangeBit- 00:08:18.580 [2024-11-16 16:44:04.141985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.580 [2024-11-16 16:44:04.142012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.142087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.580 [2024-11-16 16:44:04.142104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.142214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.580 [2024-11-16 16:44:04.142236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.142354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.580 [2024-11-16 16:44:04.142376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.580 #28 NEW cov: 11836 ft: 13611 corp: 15/1251b lim: 100 exec/s: 0 rss: 68Mb L: 99/99 MS: 1 PersAutoDict- DE: "\001\000"- 00:08:18.580 [2024-11-16 16:44:04.182157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.580 [2024-11-16 16:44:04.182187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.182287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.580 [2024-11-16 16:44:04.182305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.182425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.580 [2024-11-16 16:44:04.182444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.182572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.580 [2024-11-16 16:44:04.182592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.580 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:18.580 #29 NEW cov: 11859 ft: 13638 corp: 16/1349b lim: 100 exec/s: 0 rss: 68Mb L: 98/99 MS: 1 InsertByte- 00:08:18.580 [2024-11-16 16:44:04.222252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.580 [2024-11-16 16:44:04.222280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.222359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:18.580 [2024-11-16 16:44:04.222379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.222491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:18.580 [2024-11-16 16:44:04.222510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.580 [2024-11-16 16:44:04.222626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:18.580 [2024-11-16 16:44:04.222646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.580 #30 NEW cov: 11859 ft: 13690 corp: 17/1446b lim: 100 exec/s: 0 rss: 68Mb L: 97/99 MS: 1 CopyPart- 00:08:18.580 [2024-11-16 
[repeated records elided: between each coverage update below, every fuzz input produced blocks of "nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0-4 nsid:0" followed by "nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cdw0:0 p:0 m:0 dnr:1" for each cid]
#31 NEW cov: 11859 ft: 13718 corp: 18/1543b lim: 100 exec/s: 0 rss: 68Mb L: 97/99 MS: 1 ChangeBinInt-
#32 NEW cov: 11859 ft: 13729 corp: 19/1642b lim: 100 exec/s: 32 rss: 68Mb L: 99/99 MS: 1 CopyPart-
#33 NEW cov: 11859 ft: 13757 corp: 20/1739b lim: 100 exec/s: 33 rss: 68Mb L: 97/99 MS: 1 ChangeBinInt-
#34 NEW cov: 11859 ft: 13773 corp: 21/1836b lim: 100 exec/s: 34 rss: 68Mb L: 97/99 MS: 1 CMP- DE: "\001\000\000\000\002\010'\323"-
#35 NEW cov: 11859 ft: 13838 corp: 22/1936b lim: 100 exec/s: 35 rss: 68Mb L: 100/100 MS: 1 InsertByte-
#36 NEW cov: 11859 ft: 13844 corp: 23/2035b lim: 100 exec/s: 36 rss: 68Mb L: 99/100 MS: 1 ChangeByte-
#37 NEW cov: 11859 ft: 13863 corp: 24/2134b lim: 100 exec/s: 37 rss: 68Mb L: 99/100 MS: 1 CrossOver-
#38 NEW cov: 11859 ft: 13893 corp: 25/2190b lim: 100 exec/s: 38 rss: 68Mb L: 56/100 MS: 1 EraseBytes-
#39 NEW cov: 11859 ft: 13938 corp: 26/2286b lim: 100 exec/s: 39 rss: 68Mb L: 96/100 MS: 1 ChangeByte-
#40 NEW cov: 11859 ft: 14017 corp: 27/2383b lim: 100 exec/s: 40 rss: 68Mb L: 97/100 MS: 1 ChangeByte-
#41 NEW cov: 11859 ft: 14033 corp: 28/2483b lim: 100 exec/s: 41 rss: 68Mb L: 100/100 MS: 1 InsertByte-
#42 NEW cov: 11859 ft: 14040 corp: 29/2582b lim: 100 exec/s: 42 rss: 68Mb L: 99/100 MS: 1 ChangeBit-
#43 NEW cov: 11859 ft: 14043 corp: 30/2679b lim: 100 exec/s: 43 rss: 68Mb L: 97/100 MS: 1 ChangeBit-
#44 NEW cov: 11859 ft: 14297 corp: 31/2748b lim: 100 exec/s: 44 rss: 68Mb L: 69/100 MS: 1 CrossOver-
#45 NEW cov: 11859 ft: 14304 corp: 32/2847b lim: 100 exec/s: 45 rss: 69Mb L: 99/100 MS: 1 ChangeBit-
#46 NEW cov: 11859 ft: 14337 corp: 33/2944b lim: 100 exec/s: 46 rss: 69Mb L: 97/100 MS: 1 ChangeBinInt-
#47 NEW cov: 11859 ft: 14362 corp: 34/3019b lim: 100 exec/s: 47 rss: 69Mb L: 75/100 MS: 1 EraseBytes-
#48 NEW cov: 11859 ft: 14373 corp: 35/3118b lim: 100 exec/s: 48 rss: 69Mb L: 99/100 MS: 1 CrossOver-
#49 NEW cov: 11859 ft: 14401 corp: 36/3218b lim: 100 exec/s: 49 rss: 69Mb L: 100/100 MS: 1 CrossOver-
#50 NEW cov: 11859 ft: 14402 corp: 37/3315b lim: 100 exec/s: 50 rss: 69Mb L: 97/100 MS: 1 PersAutoDict- DE: "\001\000\000\000\002\010'\323"-
#51 NEW cov: 11859 ft: 14415 corp: 38/3415b lim: 100 exec/s: 51 rss: 69Mb L: 100/100 MS: 1 CrossOver-
#52 NEW cov: 11859 ft: 14423 corp: 39/3512b lim: 100 exec/s: 52 rss: 69Mb L: 97/100 MS: 1 PersAutoDict- DE: "\001\000\000\000\002\010'\323"-
#53 NEW cov: 11859 ft: 14453 corp: 40/3609b lim: 100 exec/s: 53 rss: 69Mb L: 97/100 MS: 1 ChangeByte-
#54 NEW cov: 11859 ft: 14460 corp: 41/3709b lim: 100 exec/s: 54 rss: 69Mb L: 100/100 MS: 1 ChangeByte-
#55 NEW cov: 11859 ft: 14469 corp: 42/3808b lim: 100 exec/s: 55 rss: 69Mb L: 99/100 MS: 1 PersAutoDict- DE: "\001\000\000\000\002\010'\323"-
#56 NEW cov: 11859 ft: 14480 corp: 43/3850b lim: 100 exec/s: 28 rss: 69Mb L: 42/100 MS: 1 ShuffleBytes-
#56 DONE cov: 11859 ft: 14480 corp: 43/3850b lim: 100 exec/s: 28 rss: 69Mb
00:08:19.624 ###### Recommended dictionary. ######
00:08:19.624 "\001\000" # Uses: 2
00:08:19.624 "\001\000\000\000\002\010'\323" # Uses: 3
00:08:19.624 ###### End of recommended dictionary. ######
00:08:19.624 Done 56 runs in 2 second(s)
00:08:19.884 16:44:05 -- nvmf/run.sh@46 -- rm -rf /tmp/fuzz_json_18.conf
16:44:05 -- ../common.sh@72 -- (( i++ ))
16:44:05 -- ../common.sh@72 -- (( i < fuzz_num ))
16:44:05 -- ../common.sh@73 -- start_llvm_fuzz 19 1 0x1
16:44:05 -- nvmf/run.sh@23 -- local fuzzer_type=19
16:44:05 -- nvmf/run.sh@24 -- local timen=1
16:44:05 -- nvmf/run.sh@25 -- local core=0x1
16:44:05 -- nvmf/run.sh@26 -- local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
16:44:05 -- nvmf/run.sh@27 -- local nvmf_cfg=/tmp/fuzz_json_19.conf
16:44:05 -- nvmf/run.sh@29 -- printf %02d 19
16:44:05 -- nvmf/run.sh@29 -- port=4419
16:44:05 -- nvmf/run.sh@30 -- mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
16:44:05 -- nvmf/run.sh@32 -- trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
16:44:05 -- nvmf/run.sh@33 -- sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
16:44:05 -- nvmf/run.sh@36 -- /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock
[2024-11-16 16:44:05.489576] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization...
[2024-11-16 16:44:05.489648] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid491549 ]
EAL: No free 2048 kB hugepages reported on node 1
[2024-11-16 16:44:05.670332] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-11-16 16:44:05.689750] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
[2024-11-16 16:44:05.689861] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
[2024-11-16 16:44:05.741113] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
[2024-11-16 16:44:05.757408] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 ***
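At this point run.sh has rebuilt the target config and launched llvm_nvme_fuzz as fuzzer 19 against the listener above. For orientation, a libFuzzer harness of the general shape named in the NEW_FUNC lines below might look like the following minimal sketch. The struct layout and the helper are illustrative assumptions, not SPDK's actual llvm_nvme_fuzz.c code; only the standard LLVMFuzzerTestOneInput signature and the TestOneInput / fuzz_nvm_write_uncorrectable_command symbol names come from this log.

```c
/*
 * Minimal sketch of a libFuzzer entry point in the shape SPDK's harness
 * reports below. Field mapping is hypothetical.
 */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct fuzz_cmd {            /* hypothetical view of the raw fuzz input */
    uint64_t lba;            /* starting LBA, taken verbatim from input bytes */
    uint32_t nlb;            /* number of logical blocks */
};

static void send_write_uncorrectable(uint64_t lba, uint32_t nlb)
{
    /* In the real harness this would submit the command to the NVMe-oF
     * target at 127.0.0.1:4419 and print the completion, producing the
     * "WRITE UNCORRECTABLE ... lba:... len:..." records in this log. */
    (void)lba;
    (void)nlb;
}

/* libFuzzer calls this once per generated input. */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
    struct fuzz_cmd cmd;

    if (size < sizeof(cmd)) {
        return 0;            /* reject inputs that are too short */
    }
    memcpy(&cmd, data, sizeof(cmd));
    send_write_uncorrectable(cmd.lba, cmd.nlb);
    return 0;                /* return value is ignored by libFuzzer */
}
```

Such a target is linked with `clang -fsanitize=fuzzer`, which supplies main() and the mutation engine that prints the `#N NEW cov:` status lines.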
INFO: Running with entropic power schedule (0xFF, 100).
00:08:20.144 INFO: Seed: 2203612330
00:08:20.144 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63),
00:08:20.144 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8),
00:08:20.144 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:20.144 INFO: A corpus is not provided, starting from an empty corpus
00:08:20.144 #2 INITED exec/s: 0 rss: 59Mb
00:08:20.144 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:20.144 This may also happen if the target rejected all inputs we tried so far
[repeated records elided: blocks of "nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0-4 nsid:0 lba:2387225703270654241/2387225703656530209 len:8482", each followed by "nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b)"]
NEW_FUNC[1/670]: 0x4719d8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582
NEW_FUNC[2/670]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
#3 NEW cov: 11610 ft: 11611 corp: 2/51b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 InsertRepeatedBytes-
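The lba/len values in the records above are consistent with the InsertRepeatedBytes mutation reported for input #3: they are fuzz bytes reinterpreted as command fields. A quick standalone decode (plain C, not part of the harness; the two constants are copied from the log):

```c
/* Decode the lba/len values printed in the WRITE UNCORRECTABLE records. */
#include <inttypes.h>
#include <stdio.h>

int main(void)
{
    uint64_t lba = 2387225703656530209ULL; /* value from the records above */
    uint32_t len = 8482;                   /* ditto */

    /* prints 0x2121212121212121: eight repeated 0x21 ('!') bytes */
    printf("lba = 0x%016" PRIx64 "\n", lba);
    /* prints 0x2122: two more bytes from the same repeated-byte input */
    printf("len = 0x%" PRIx32 "\n", len);
    return 0;
}
```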
[repeated records elided: as above, blocks of "WRITE UNCORRECTABLE sqid:1 cid:0-4 nsid:0 lba:... len:..." requests and their "INVALID NAMESPACE OR FORMAT (00/0b)" completions appear between the coverage updates below]
#9 NEW cov: 11723 ft: 12148 corp: 3/101b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeByte-
#10 NEW cov: 11729 ft: 12460 corp: 4/151b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeBit-
#11 NEW cov: 11814 ft: 13036 corp: 5/161b lim: 50 exec/s: 0 rss: 67Mb L: 10/50 MS: 1 InsertRepeatedBytes-
#12 NEW cov: 11814 ft: 13190 corp: 6/211b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeBinInt-
#13 NEW cov: 11814 ft: 13274 corp: 7/261b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeByte-
#14 NEW cov: 11814 ft: 13428 corp: 8/311b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeByte-
#15 NEW cov: 11814 ft: 13497 corp: 9/361b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"-
#16 NEW cov: 11814 ft: 13518 corp: 10/411b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 CopyPart-
#20 NEW cov: 11814 ft: 13550 corp: 11/421b lim: 50 exec/s: 0 rss: 67Mb L: 10/50 MS: 4 ChangeByte-ChangeByte-InsertByte-PersAutoDict- DE: "\000\000\000\000\000\000\000\000"-
#21 NEW cov: 11814 ft: 13593 corp: 12/431b lim: 50 exec/s: 0 rss: 67Mb L: 10/50 MS: 1 ShuffleBytes-
NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
#22 NEW cov: 11831 ft: 13643 corp: 13/481b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ChangeByte-
#23 NEW cov: 11831 ft: 13731 corp: 14/491b lim: 50 exec/s: 23 rss: 68Mb L: 10/50 MS: 1 ChangeBinInt-
#24 NEW cov: 11831 ft: 13835 corp: 15/535b lim: 50 exec/s: 24 rss: 68Mb L: 44/50 MS: 1 InsertRepeatedBytes-
#25 NEW cov: 11831 ft: 13855 corp: 16/585b lim: 50 exec/s: 25 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes-
#26 NEW cov: 11831 ft: 13858 corp: 17/635b lim: 50 exec/s: 26 rss: 68Mb L: 50/50 MS: 1 CopyPart-
#27 NEW cov: 11831 ft: 13881 corp: 18/685b lim: 50 exec/s: 27 rss: 68Mb L: 50/50 MS: 1 ChangeBit-
#28 NEW cov: 11831 ft: 13903 corp: 19/731b lim: 50 exec/s: 28 rss: 68Mb L: 46/50 MS: 1 CrossOver-
#29 NEW cov: 11831 ft: 13979 corp: 20/781b lim: 50 exec/s: 29 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes-
#30 NEW cov: 11831 ft: 14021 corp: 21/799b lim: 50 exec/s: 30 rss: 68Mb L: 18/50 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"-
#31 NEW cov: 11831 ft: 14084 corp: 22/849b lim: 50 exec/s: 31 rss: 68Mb L: 50/50 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"-
#32 NEW cov: 11831 ft: 14138 corp: 23/899b lim: 50 exec/s: 32 rss: 68Mb L: 50/50 MS: 1 ChangeBit-
[2024-11-16
16:44:07.326107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2387225703656530209 len:8482 00:08:21.705 [2024-11-16 16:44:07.326122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.705 [2024-11-16 16:44:07.326148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:2387225703656530209 len:8482 00:08:21.705 [2024-11-16 16:44:07.326164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.705 #33 NEW cov: 11831 ft: 14154 corp: 24/949b lim: 50 exec/s: 33 rss: 68Mb L: 50/50 MS: 1 CopyPart- 00:08:21.705 [2024-11-16 16:44:07.375942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1073741824 len:94 00:08:21.705 [2024-11-16 16:44:07.375972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.705 #34 NEW cov: 11831 ft: 14162 corp: 25/960b lim: 50 exec/s: 34 rss: 68Mb L: 11/50 MS: 1 InsertByte- 00:08:21.705 [2024-11-16 16:44:07.436710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1085102596195028751 len:3851 00:08:21.705 [2024-11-16 16:44:07.436741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.964 #35 NEW cov: 11831 ft: 14238 corp: 26/970b lim: 50 exec/s: 35 rss: 68Mb L: 10/50 MS: 1 ShuffleBytes- 00:08:21.964 [2024-11-16 16:44:07.477054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2387225703269146913 len:8482 00:08:21.964 [2024-11-16 16:44:07.477085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.477152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2377900603807441185 len:1 00:08:21.964 [2024-11-16 16:44:07.477173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.477239] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2387225703100710945 len:8482 00:08:21.964 [2024-11-16 16:44:07.477259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.964 #36 NEW cov: 11831 ft: 14554 corp: 27/1005b lim: 50 exec/s: 36 rss: 68Mb L: 35/50 MS: 1 CrossOver- 00:08:21.964 [2024-11-16 16:44:07.516888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2387225703269146913 len:8482 00:08:21.964 [2024-11-16 16:44:07.516918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.964 #37 NEW cov: 11831 ft: 14569 corp: 28/1022b lim: 50 exec/s: 37 rss: 68Mb L: 17/50 MS: 1 CrossOver- 00:08:21.964 [2024-11-16 16:44:07.557409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069667229695 len:65536 00:08:21.964 [2024-11-16 16:44:07.557442] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.557511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:21.964 [2024-11-16 16:44:07.557536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.557605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551396 len:65536 00:08:21.964 [2024-11-16 16:44:07.557626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.557697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:3856 00:08:21.964 [2024-11-16 16:44:07.557718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.964 #38 NEW cov: 11831 ft: 14592 corp: 29/1069b lim: 50 exec/s: 38 rss: 68Mb L: 47/50 MS: 1 InsertByte- 00:08:21.964 [2024-11-16 16:44:07.597359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2387225703269146913 len:8482 00:08:21.964 [2024-11-16 16:44:07.597389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.597452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2377900603807441185 len:1 00:08:21.964 [2024-11-16 16:44:07.597474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.597537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2387225703100710945 len:8994 00:08:21.964 [2024-11-16 16:44:07.597557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.964 #39 NEW cov: 11831 ft: 14620 corp: 30/1104b lim: 50 exec/s: 39 rss: 68Mb L: 35/50 MS: 1 ChangeBinInt- 00:08:21.964 [2024-11-16 16:44:07.637699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2387225703270654241 len:8482 00:08:21.964 [2024-11-16 16:44:07.637729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.637786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2387189278038892800 len:1 00:08:21.964 [2024-11-16 16:44:07.637808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.637872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2387225703100710912 len:8482 00:08:21.964 [2024-11-16 16:44:07.637894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.637958] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2387225703656530209 len:8482 00:08:21.964 [2024-11-16 16:44:07.637977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.964 [2024-11-16 16:44:07.638045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:2387296072400707873 len:8482 00:08:21.964 [2024-11-16 16:44:07.638065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:21.964 #40 NEW cov: 11831 ft: 14636 corp: 31/1154b lim: 50 exec/s: 40 rss: 68Mb L: 50/50 MS: 1 ChangeBit- 00:08:21.964 [2024-11-16 16:44:07.677423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1085102596195028751 len:3851 00:08:21.964 [2024-11-16 16:44:07.677452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.964 #41 NEW cov: 11838 ft: 14661 corp: 32/1165b lim: 50 exec/s: 41 rss: 68Mb L: 11/50 MS: 1 InsertByte- 00:08:22.224 [2024-11-16 16:44:07.717528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1085102594718633743 len:3851 00:08:22.224 [2024-11-16 16:44:07.717557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.224 #42 NEW cov: 11838 ft: 14687 corp: 33/1175b lim: 50 exec/s: 42 rss: 68Mb L: 10/50 MS: 1 ChangeBit- 00:08:22.224 [2024-11-16 16:44:07.757820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2387225703269146913 len:8482 00:08:22.224 [2024-11-16 16:44:07.757849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.224 [2024-11-16 16:44:07.757914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2377900603807441185 len:1 00:08:22.224 [2024-11-16 16:44:07.757935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.224 [2024-11-16 16:44:07.758000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2387225703100710945 len:8482 00:08:22.224 [2024-11-16 16:44:07.758019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.224 #43 NEW cov: 11838 ft: 14699 corp: 34/1210b lim: 50 exec/s: 43 rss: 68Mb L: 35/50 MS: 1 ShuffleBytes- 00:08:22.224 [2024-11-16 16:44:07.798095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069667229695 len:65536 00:08:22.224 [2024-11-16 16:44:07.798124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.224 [2024-11-16 16:44:07.798185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709420543 len:65536 00:08:22.224 [2024-11-16 16:44:07.798207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.224 
[2024-11-16 16:44:07.798270] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:22.224 [2024-11-16 16:44:07.798289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.224 [2024-11-16 16:44:07.798355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446743042917400575 len:3856 00:08:22.224 [2024-11-16 16:44:07.798374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.224 #49 NEW cov: 11838 ft: 14728 corp: 35/1254b lim: 50 exec/s: 24 rss: 68Mb L: 44/50 MS: 1 ChangeBit- 00:08:22.224 #49 DONE cov: 11838 ft: 14728 corp: 35/1254b lim: 50 exec/s: 24 rss: 68Mb 00:08:22.224 ###### Recommended dictionary. ###### 00:08:22.224 "\000\000\000\000\000\000\000\000" # Uses: 4 00:08:22.224 ###### End of recommended dictionary. ###### 00:08:22.224 Done 49 runs in 2 second(s) 00:08:22.224 16:44:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:08:22.224 16:44:07 -- ../common.sh@72 -- # (( i++ )) 00:08:22.224 16:44:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:22.224 16:44:07 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:22.224 16:44:07 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:22.224 16:44:07 -- nvmf/run.sh@24 -- # local timen=1 00:08:22.224 16:44:07 -- nvmf/run.sh@25 -- # local core=0x1 00:08:22.224 16:44:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:22.224 16:44:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:22.224 16:44:07 -- nvmf/run.sh@29 -- # printf %02d 20 00:08:22.224 16:44:07 -- nvmf/run.sh@29 -- # port=4420 00:08:22.224 16:44:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:22.224 16:44:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:22.224 16:44:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:22.224 16:44:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:08:22.484 [2024-11-16 16:44:07.973528] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
00:08:22.484 [2024-11-16 16:44:07.973593] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid492094 ] 00:08:22.484 EAL: No free 2048 kB hugepages reported on node 1 00:08:22.484 [2024-11-16 16:44:08.148882] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:22.484 [2024-11-16 16:44:08.168240] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:22.484 [2024-11-16 16:44:08.168369] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.484 [2024-11-16 16:44:08.219732] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:22.744 [2024-11-16 16:44:08.236086] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:22.744 INFO: Running with entropic power schedule (0xFF, 100). 00:08:22.744 INFO: Seed: 387636383 00:08:22.744 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:22.744 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:22.744 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:22.744 INFO: A corpus is not provided, starting from an empty corpus 00:08:22.744 #2 INITED exec/s: 0 rss: 59Mb 00:08:22.744 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:22.744 This may also happen if the target rejected all inputs we tried so far 00:08:22.744 [2024-11-16 16:44:08.291453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:22.744 [2024-11-16 16:44:08.291484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.744 [2024-11-16 16:44:08.291550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:22.744 [2024-11-16 16:44:08.291572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.744 [2024-11-16 16:44:08.291639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:22.744 [2024-11-16 16:44:08.291659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.003 NEW_FUNC[1/668]: 0x473598 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:23.003 NEW_FUNC[2/668]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.003 #5 NEW cov: 11647 ft: 11653 corp: 2/65b lim: 90 exec/s: 0 rss: 67Mb L: 64/64 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:23.003 [2024-11-16 16:44:08.592272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.003 [2024-11-16 16:44:08.592306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.003 [2024-11-16 16:44:08.592379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.003 
[2024-11-16 16:44:08.592400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.003 [2024-11-16 16:44:08.592467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.003 [2024-11-16 16:44:08.592486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.003 [2024-11-16 16:44:08.592550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.003 [2024-11-16 16:44:08.592569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.003 NEW_FUNC[1/4]: 0xf5e728 in posix_sock_flush /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1441 00:08:23.003 NEW_FUNC[2/4]: 0x16c7178 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1090 00:08:23.003 #6 NEW cov: 11781 ft: 12531 corp: 3/143b lim: 90 exec/s: 0 rss: 67Mb L: 78/78 MS: 1 InsertRepeatedBytes- 00:08:23.003 [2024-11-16 16:44:08.642409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.004 [2024-11-16 16:44:08.642439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.004 [2024-11-16 16:44:08.642499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.004 [2024-11-16 16:44:08.642521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.004 [2024-11-16 16:44:08.642588] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.004 [2024-11-16 16:44:08.642608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.004 [2024-11-16 16:44:08.642674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.004 [2024-11-16 16:44:08.642693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.004 #7 NEW cov: 11787 ft: 12734 corp: 4/221b lim: 90 exec/s: 0 rss: 67Mb L: 78/78 MS: 1 ChangeBit- 00:08:23.004 [2024-11-16 16:44:08.682522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.004 [2024-11-16 16:44:08.682550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.004 [2024-11-16 16:44:08.682608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.004 [2024-11-16 16:44:08.682629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.004 [2024-11-16 16:44:08.682698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.004 [2024-11-16 16:44:08.682721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:23.004 [2024-11-16 16:44:08.682788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.004 [2024-11-16 16:44:08.682811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.004 #8 NEW cov: 11872 ft: 12963 corp: 5/308b lim: 90 exec/s: 0 rss: 67Mb L: 87/87 MS: 1 InsertRepeatedBytes- 00:08:23.004 [2024-11-16 16:44:08.722617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.004 [2024-11-16 16:44:08.722645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.004 [2024-11-16 16:44:08.722719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.004 [2024-11-16 16:44:08.722742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.004 [2024-11-16 16:44:08.722807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.004 [2024-11-16 16:44:08.722826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.004 [2024-11-16 16:44:08.722893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.004 [2024-11-16 16:44:08.722912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.004 #9 NEW cov: 11872 ft: 13107 corp: 6/395b lim: 90 exec/s: 0 rss: 67Mb L: 87/87 MS: 1 ChangeBinInt- 00:08:23.264 [2024-11-16 16:44:08.762586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.264 [2024-11-16 16:44:08.762615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.762685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.264 [2024-11-16 16:44:08.762707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.762772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.264 [2024-11-16 16:44:08.762791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.264 #10 NEW cov: 11872 ft: 13195 corp: 7/457b lim: 90 exec/s: 0 rss: 67Mb L: 62/87 MS: 1 InsertRepeatedBytes- 00:08:23.264 [2024-11-16 16:44:08.802713] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.264 [2024-11-16 16:44:08.802742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.802810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.264 [2024-11-16 16:44:08.802831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.802898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.264 [2024-11-16 16:44:08.802917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.264 #11 NEW cov: 11872 ft: 13298 corp: 8/519b lim: 90 exec/s: 0 rss: 68Mb L: 62/87 MS: 1 ChangeBit- 00:08:23.264 [2024-11-16 16:44:08.843012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.264 [2024-11-16 16:44:08.843040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.843094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.264 [2024-11-16 16:44:08.843115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.843181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.264 [2024-11-16 16:44:08.843201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.843283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.264 [2024-11-16 16:44:08.843307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.264 #12 NEW cov: 11872 ft: 13315 corp: 9/606b lim: 90 exec/s: 0 rss: 68Mb L: 87/87 MS: 1 CrossOver- 00:08:23.264 [2024-11-16 16:44:08.882932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.264 [2024-11-16 16:44:08.882961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.883027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.264 [2024-11-16 16:44:08.883049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.883115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.264 [2024-11-16 16:44:08.883134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.264 #13 NEW cov: 11872 ft: 13335 corp: 10/669b lim: 90 exec/s: 0 rss: 68Mb L: 63/87 MS: 1 InsertByte- 00:08:23.264 [2024-11-16 16:44:08.923172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.264 [2024-11-16 16:44:08.923202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.923266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.264 [2024-11-16 16:44:08.923287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.923354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.264 [2024-11-16 16:44:08.923373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.923438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.264 [2024-11-16 16:44:08.923458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.264 #14 NEW cov: 11872 ft: 13355 corp: 11/756b lim: 90 exec/s: 0 rss: 68Mb L: 87/87 MS: 1 ChangeBit- 00:08:23.264 [2024-11-16 16:44:08.963333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.264 [2024-11-16 16:44:08.963363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.963427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.264 [2024-11-16 16:44:08.963449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.963515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.264 [2024-11-16 16:44:08.963535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:08.963599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.264 [2024-11-16 16:44:08.963619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.264 #15 NEW cov: 11872 ft: 13393 corp: 12/844b lim: 90 exec/s: 0 rss: 68Mb L: 88/88 MS: 1 InsertByte- 00:08:23.264 [2024-11-16 16:44:09.003549] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.264 [2024-11-16 16:44:09.003580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:09.003652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.264 [2024-11-16 16:44:09.003677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:09.003749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.264 [2024-11-16 16:44:09.003772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:09.003839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.264 [2024-11-16 16:44:09.003860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.264 [2024-11-16 16:44:09.003928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:23.264 [2024-11-16 16:44:09.003950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:23.523 #19 NEW cov: 11872 ft: 13513 corp: 13/934b lim: 90 exec/s: 0 rss: 68Mb L: 90/90 MS: 4 CopyPart-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:23.523 [2024-11-16 16:44:09.043388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.523 [2024-11-16 16:44:09.043418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.523 [2024-11-16 16:44:09.043485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.523 [2024-11-16 16:44:09.043507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.523 [2024-11-16 16:44:09.043575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.523 [2024-11-16 16:44:09.043594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.523 #20 NEW cov: 11872 ft: 13530 corp: 14/996b lim: 90 exec/s: 0 rss: 68Mb L: 62/90 MS: 1 CopyPart- 00:08:23.523 [2024-11-16 16:44:09.083644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.523 [2024-11-16 16:44:09.083678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.523 [2024-11-16 16:44:09.083741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.523 [2024-11-16 16:44:09.083763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.083827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.524 [2024-11-16 16:44:09.083847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.083913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.524 [2024-11-16 16:44:09.083932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.524 #21 NEW cov: 11872 ft: 13591 corp: 15/1078b lim: 90 exec/s: 0 rss: 68Mb L: 82/90 MS: 1 CopyPart- 00:08:23.524 [2024-11-16 16:44:09.123750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.524 [2024-11-16 16:44:09.123779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.123870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.524 [2024-11-16 16:44:09.123892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.123963] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.524 [2024-11-16 16:44:09.123986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.124067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.524 [2024-11-16 16:44:09.124086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.524 #22 NEW cov: 11872 ft: 13618 corp: 16/1166b lim: 90 exec/s: 0 rss: 68Mb L: 88/90 MS: 1 ChangeByte- 00:08:23.524 [2024-11-16 16:44:09.163557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.524 [2024-11-16 16:44:09.163586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.163655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.524 [2024-11-16 16:44:09.163683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.524 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:23.524 #23 NEW cov: 11895 ft: 14045 corp: 17/1213b lim: 90 exec/s: 0 rss: 68Mb L: 47/90 MS: 1 EraseBytes- 00:08:23.524 [2024-11-16 16:44:09.203969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.524 [2024-11-16 16:44:09.203998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.204059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.524 [2024-11-16 16:44:09.204079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.204148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.524 [2024-11-16 16:44:09.204167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.204232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.524 [2024-11-16 16:44:09.204251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.524 #24 NEW cov: 11895 ft: 14064 corp: 18/1301b lim: 90 exec/s: 0 rss: 68Mb L: 88/90 MS: 1 ShuffleBytes- 00:08:23.524 [2024-11-16 16:44:09.243803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.524 [2024-11-16 16:44:09.243832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.524 [2024-11-16 16:44:09.243900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.524 [2024-11-16 16:44:09.243922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.524 #30 NEW cov: 11895 ft: 14075 corp: 19/1351b lim: 90 exec/s: 0 rss: 68Mb L: 50/90 MS: 1 EraseBytes- 00:08:23.783 [2024-11-16 16:44:09.284336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.783 [2024-11-16 16:44:09.284365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.284426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.783 [2024-11-16 16:44:09.284446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.284516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.783 [2024-11-16 16:44:09.284535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.284598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.783 [2024-11-16 16:44:09.284617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.284688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:23.783 [2024-11-16 16:44:09.284711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:23.783 #31 NEW cov: 11895 ft: 14082 corp: 20/1441b lim: 90 exec/s: 31 rss: 68Mb L: 90/90 MS: 1 CopyPart- 00:08:23.783 [2024-11-16 16:44:09.324320] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.783 [2024-11-16 16:44:09.324348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.324422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.783 [2024-11-16 16:44:09.324443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.324507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.783 [2024-11-16 16:44:09.324525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.324590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.783 [2024-11-16 16:44:09.324609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.783 #32 NEW cov: 11895 ft: 14157 corp: 21/1530b lim: 90 exec/s: 32 rss: 68Mb L: 89/90 MS: 1 CopyPart- 00:08:23.783 [2024-11-16 16:44:09.364437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.783 [2024-11-16 16:44:09.364465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.364530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.783 [2024-11-16 16:44:09.364551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.364617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.783 [2024-11-16 16:44:09.364636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.364704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.783 [2024-11-16 16:44:09.364724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.783 #33 NEW cov: 11895 ft: 14173 corp: 22/1618b lim: 90 exec/s: 33 rss: 68Mb L: 88/90 MS: 1 InsertByte- 00:08:23.783 [2024-11-16 16:44:09.404550] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.783 [2024-11-16 16:44:09.404578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.404638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.783 [2024-11-16 16:44:09.404661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.404736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.783 [2024-11-16 16:44:09.404759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.404837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.783 [2024-11-16 16:44:09.404856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.783 #34 NEW cov: 11895 ft: 14243 corp: 23/1706b lim: 90 exec/s: 34 rss: 68Mb L: 88/90 MS: 1 ChangeBinInt- 00:08:23.783 [2024-11-16 16:44:09.444672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.783 [2024-11-16 16:44:09.444701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.444759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.783 [2024-11-16 16:44:09.444781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.783 [2024-11-16 16:44:09.444847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.783 [2024-11-16 16:44:09.444866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.784 [2024-11-16 16:44:09.444930] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.784 [2024-11-16 16:44:09.444949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.784 #35 NEW cov: 11895 ft: 14327 corp: 24/1788b lim: 90 exec/s: 35 rss: 68Mb L: 82/90 MS: 1 ChangeBinInt- 00:08:23.784 [2024-11-16 16:44:09.484791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.784 [2024-11-16 16:44:09.484820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.784 [2024-11-16 16:44:09.484877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.784 [2024-11-16 16:44:09.484898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.784 [2024-11-16 16:44:09.484963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.784 [2024-11-16 16:44:09.484981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.784 [2024-11-16 16:44:09.485046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.784 [2024-11-16 16:44:09.485066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.784 #36 NEW cov: 11895 ft: 14330 corp: 25/1875b lim: 90 exec/s: 36 rss: 68Mb L: 87/90 MS: 1 ShuffleBytes- 00:08:23.784 [2024-11-16 16:44:09.524887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:23.784 [2024-11-16 16:44:09.524916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.784 [2024-11-16 16:44:09.524976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:23.784 [2024-11-16 16:44:09.524997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.784 [2024-11-16 16:44:09.525063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:23.784 [2024-11-16 16:44:09.525085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.784 [2024-11-16 16:44:09.525150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:23.784 [2024-11-16 16:44:09.525169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.043 #37 NEW cov: 11898 ft: 14449 corp: 26/1963b lim: 90 exec/s: 37 rss: 69Mb L: 88/90 MS: 1 CopyPart- 00:08:24.043 [2024-11-16 16:44:09.565197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.043 [2024-11-16 16:44:09.565226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.565284] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.043 [2024-11-16 16:44:09.565304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.565370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.043 [2024-11-16 16:44:09.565390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.565452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.043 [2024-11-16 16:44:09.565471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.565534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:24.043 [2024-11-16 16:44:09.565553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:24.043 #38 NEW cov: 11898 ft: 14469 corp: 27/2053b lim: 90 exec/s: 38 rss: 69Mb L: 90/90 MS: 1 CopyPart- 00:08:24.043 [2024-11-16 16:44:09.604818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.043 [2024-11-16 16:44:09.604846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.604913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.043 [2024-11-16 16:44:09.604937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.043 #39 NEW cov: 11898 ft: 14483 corp: 28/2102b lim: 90 exec/s: 39 rss: 69Mb L: 49/90 MS: 1 EraseBytes- 00:08:24.043 [2024-11-16 16:44:09.645252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.043 [2024-11-16 16:44:09.645279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.645338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.043 [2024-11-16 16:44:09.645359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.645425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.043 [2024-11-16 16:44:09.645444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.645509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.043 [2024-11-16 16:44:09.645528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.043 #40 NEW cov: 11898 ft: 14490 corp: 29/2189b lim: 90 exec/s: 40 rss: 69Mb L: 87/90 MS: 1 CrossOver- 00:08:24.043 [2024-11-16 16:44:09.685341] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.043 [2024-11-16 16:44:09.685369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.685432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.043 [2024-11-16 16:44:09.685454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.685520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.043 [2024-11-16 16:44:09.685541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.043 [2024-11-16 16:44:09.685621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.043 [2024-11-16 16:44:09.685640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.043 #41 NEW cov: 11898 ft: 14529 corp: 30/2277b lim: 90 exec/s: 41 rss: 69Mb L: 88/90 MS: 1 ChangeBit- 00:08:24.044 [2024-11-16 16:44:09.725465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.044 [2024-11-16 16:44:09.725493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.044 [2024-11-16 16:44:09.725552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.044 [2024-11-16 16:44:09.725573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.044 [2024-11-16 16:44:09.725638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.044 [2024-11-16 16:44:09.725657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.044 [2024-11-16 16:44:09.725740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.044 [2024-11-16 16:44:09.725760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.044 #42 NEW cov: 11898 ft: 14533 corp: 31/2364b lim: 90 exec/s: 42 rss: 69Mb L: 87/90 MS: 1 ShuffleBytes- 00:08:24.044 [2024-11-16 16:44:09.765262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.044 [2024-11-16 16:44:09.765291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.044 [2024-11-16 16:44:09.765361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.044 [2024-11-16 16:44:09.765385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.044 #43 NEW cov: 11898 ft: 14581 corp: 32/2415b lim: 90 exec/s: 43 rss: 69Mb L: 51/90 MS: 1 EraseBytes- 00:08:24.303 [2024-11-16 16:44:09.805685] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.303 [2024-11-16 16:44:09.805713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.303 [2024-11-16 16:44:09.805771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.303 [2024-11-16 16:44:09.805792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.303 [2024-11-16 16:44:09.805856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.303 [2024-11-16 16:44:09.805875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.303 [2024-11-16 16:44:09.805943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.303 [2024-11-16 16:44:09.805962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.303 #44 NEW cov: 11898 ft: 14583 corp: 33/2489b lim: 90 exec/s: 44 rss: 69Mb L: 74/90 MS: 1 InsertRepeatedBytes- 00:08:24.303 [2024-11-16 16:44:09.845506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.303 [2024-11-16 16:44:09.845535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.303 [2024-11-16 16:44:09.845604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.303 [2024-11-16 16:44:09.845626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.303 #45 NEW cov: 11898 ft: 14641 corp: 34/2538b lim: 90 exec/s: 45 rss: 69Mb L: 49/90 MS: 1 ChangeByte- 00:08:24.303 [2024-11-16 16:44:09.885896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.303 [2024-11-16 16:44:09.885924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.303 [2024-11-16 16:44:09.885982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.303 [2024-11-16 16:44:09.886003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.303 [2024-11-16 16:44:09.886067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.303 [2024-11-16 16:44:09.886086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.303 [2024-11-16 16:44:09.886152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.303 [2024-11-16 16:44:09.886171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.303 #46 NEW cov: 11898 ft: 14648 corp: 35/2626b lim: 90 exec/s: 46 rss: 69Mb L: 88/90 MS: 1 CopyPart- 00:08:24.303 [2024-11-16 16:44:09.926035] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.304 [2024-11-16 16:44:09.926063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.304 [2024-11-16 16:44:09.926119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.304 [2024-11-16 16:44:09.926141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.304 [2024-11-16 16:44:09.926206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.304 [2024-11-16 16:44:09.926227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.304 [2024-11-16 16:44:09.926290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.304 [2024-11-16 16:44:09.926309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.304 #47 NEW cov: 11898 ft: 14665 corp: 36/2704b lim: 90 exec/s: 47 rss: 69Mb L: 78/90 MS: 1 ChangeBit- 00:08:24.304 [2024-11-16 16:44:09.965997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.304 [2024-11-16 16:44:09.966025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.304 [2024-11-16 16:44:09.966093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.304 [2024-11-16 16:44:09.966118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.304 [2024-11-16 16:44:09.966183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.304 [2024-11-16 16:44:09.966202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.304 #48 NEW cov: 11898 ft: 14744 corp: 37/2769b lim: 90 exec/s: 48 rss: 69Mb L: 65/90 MS: 1 InsertRepeatedBytes- 00:08:24.304 [2024-11-16 16:44:10.006315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.304 [2024-11-16 16:44:10.006344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.304 [2024-11-16 16:44:10.006412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.304 [2024-11-16 16:44:10.006435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.304 [2024-11-16 16:44:10.006507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.304 [2024-11-16 16:44:10.006530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.304 [2024-11-16 16:44:10.006598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 
00:08:24.304 [2024-11-16 16:44:10.006620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.304 #49 NEW cov: 11898 ft: 14755 corp: 38/2847b lim: 90 exec/s: 49 rss: 69Mb L: 78/90 MS: 1 ChangeBit- 00:08:24.563 [2024-11-16 16:44:10.056471] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.563 [2024-11-16 16:44:10.056502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.563 [2024-11-16 16:44:10.056555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.563 [2024-11-16 16:44:10.056577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.563 [2024-11-16 16:44:10.056644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.563 [2024-11-16 16:44:10.056666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.056754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.564 [2024-11-16 16:44:10.056780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.564 #50 NEW cov: 11898 ft: 14763 corp: 39/2936b lim: 90 exec/s: 50 rss: 69Mb L: 89/90 MS: 1 InsertByte- 00:08:24.564 [2024-11-16 16:44:10.096687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.564 [2024-11-16 16:44:10.096720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.096789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.564 [2024-11-16 16:44:10.096811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.096881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.564 [2024-11-16 16:44:10.096904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.096976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.564 [2024-11-16 16:44:10.096997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.097063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:24.564 [2024-11-16 16:44:10.097084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:24.564 #51 NEW cov: 11898 ft: 14765 corp: 40/3026b lim: 90 exec/s: 51 rss: 69Mb L: 90/90 MS: 1 ShuffleBytes- 00:08:24.564 [2024-11-16 16:44:10.146511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 
00:08:24.564 [2024-11-16 16:44:10.146541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.146609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.564 [2024-11-16 16:44:10.146632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.146706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.564 [2024-11-16 16:44:10.146728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.564 #52 NEW cov: 11898 ft: 14777 corp: 41/3085b lim: 90 exec/s: 52 rss: 69Mb L: 59/90 MS: 1 CrossOver- 00:08:24.564 [2024-11-16 16:44:10.186506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.564 [2024-11-16 16:44:10.186552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.186623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.564 [2024-11-16 16:44:10.186647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.564 #53 NEW cov: 11898 ft: 14788 corp: 42/3132b lim: 90 exec/s: 53 rss: 70Mb L: 47/90 MS: 1 ShuffleBytes- 00:08:24.564 [2024-11-16 16:44:10.226921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.564 [2024-11-16 16:44:10.226950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.227011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.564 [2024-11-16 16:44:10.227049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.227118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.564 [2024-11-16 16:44:10.227140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.227207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.564 [2024-11-16 16:44:10.227226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.564 #54 NEW cov: 11898 ft: 14814 corp: 43/3216b lim: 90 exec/s: 54 rss: 70Mb L: 84/90 MS: 1 InsertRepeatedBytes- 00:08:24.564 [2024-11-16 16:44:10.267031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.564 [2024-11-16 16:44:10.267059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.267122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 
nsid:0 00:08:24.564 [2024-11-16 16:44:10.267146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.267212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.564 [2024-11-16 16:44:10.267231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.564 [2024-11-16 16:44:10.267294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.564 [2024-11-16 16:44:10.267314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.564 #55 NEW cov: 11898 ft: 14830 corp: 44/3305b lim: 90 exec/s: 27 rss: 70Mb L: 89/90 MS: 1 CopyPart- 00:08:24.564 #55 DONE cov: 11898 ft: 14830 corp: 44/3305b lim: 90 exec/s: 27 rss: 70Mb 00:08:24.564 Done 55 runs in 2 second(s) 00:08:24.823 16:44:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:24.823 16:44:10 -- ../common.sh@72 -- # (( i++ )) 00:08:24.823 16:44:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:24.823 16:44:10 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:24.823 16:44:10 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:24.823 16:44:10 -- nvmf/run.sh@24 -- # local timen=1 00:08:24.823 16:44:10 -- nvmf/run.sh@25 -- # local core=0x1 00:08:24.823 16:44:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:24.823 16:44:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:24.823 16:44:10 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:24.823 16:44:10 -- nvmf/run.sh@29 -- # port=4421 00:08:24.823 16:44:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:24.823 16:44:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:24.823 16:44:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:24.823 16:44:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:24.823 [2024-11-16 16:44:10.450556] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
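The start_llvm_fuzz trace above condenses to the following standalone launch sequence for fuzzer 21. This is a reconstruction from the traced commands, not verbatim run.sh source: SPDK_ROOT and the redirect into $nvmf_cfg are assumptions filled in from context, while every flag and value is copied from the trace itself.

    # Sketch of the traced launch sequence (fuzzer 21); SPDK_ROOT is an assumed
    # shorthand for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk.
    fuzzer=21
    port="44$(printf '%02d' "$fuzzer")"        # trace: printf %02d 21 -> port=4421
    corpus_dir="$SPDK_ROOT/../corpus/llvm_nvmf_$fuzzer"
    nvmf_cfg="/tmp/fuzz_json_$fuzzer.conf"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$corpus_dir"
    # The redirect into $nvmf_cfg is implied by the trace rather than shown in it.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
    "$SPDK_ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$SPDK_ROOT/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t 1 \
        -D "$corpus_dir" -Z "$fuzzer" -r "/var/tmp/spdk$fuzzer.sock"

The port arithmetic matches what is visible across runs: trsvcid 4421 for fuzzer 21 above and 4422 for fuzzer 22 later in this log.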
00:08:24.823 [2024-11-16 16:44:10.450645] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid492384 ] 00:08:24.823 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.083 [2024-11-16 16:44:10.634909] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.083 [2024-11-16 16:44:10.655083] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:25.083 [2024-11-16 16:44:10.655201] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.083 [2024-11-16 16:44:10.706889] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.083 [2024-11-16 16:44:10.723227] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:25.083 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.083 INFO: Seed: 2874685563 00:08:25.083 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:25.083 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:25.083 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.083 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.083 #2 INITED exec/s: 0 rss: 60Mb 00:08:25.083 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:25.083 This may also happen if the target rejected all inputs we tried so far 00:08:25.083 [2024-11-16 16:44:10.789201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.083 [2024-11-16 16:44:10.789246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.083 [2024-11-16 16:44:10.789360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.083 [2024-11-16 16:44:10.789381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.342 NEW_FUNC[1/672]: 0x4767c8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:25.342 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:25.342 #12 NEW cov: 11643 ft: 11643 corp: 2/25b lim: 50 exec/s: 0 rss: 67Mb L: 24/24 MS: 5 InsertByte-ChangeBit-ChangeBinInt-CopyPart-InsertRepeatedBytes- 00:08:25.601 [2024-11-16 16:44:11.109310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.601 [2024-11-16 16:44:11.109349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.601 #14 NEW cov: 11756 ft: 12887 corp: 3/36b lim: 50 exec/s: 0 rss: 67Mb L: 11/24 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:25.602 [2024-11-16 16:44:11.150224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.602 [2024-11-16 16:44:11.150257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
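Two recurring line formats dominate this part of the log, so a one-time decode is worth having. The #N NEW entries are libFuzzer's standard status lines, and the *NOTICE* pairs are SPDK printing each fuzzed NVMe command and its completion. Field meanings below follow the upstream libFuzzer documentation and the NVMe specification; the reading of the L: field is inferred from, and consistent with, the surrounding entries.

    # libFuzzer status line, values copied from the #14 entry above:
    #   #14 NEW cov: 11756 ft: 12887 corp: 3/36b lim: 50 exec/s: 0 rss: 67Mb L: 11/24 MS: 2 ChangeByte-InsertRepeatedBytes-
    #   cov:    coverage points (edges/blocks) reached by the corpus so far
    #   ft:     distinct coverage features (edge counters, value profiles, ...)
    #   corp:   corpus size as inputs/bytes (here 3 inputs totalling 36 bytes)
    #   lim:    current cap on generated input length
    #   exec/s: executions per second (0 while under one second has elapsed)
    #   rss:    resident memory of the fuzz process
    #   L:      new input length / largest input in the corpus (11 of max 24)
    #   MS:     the mutation sequence that produced the input (2 steps here)
    #
    # SPDK qpair notices, one command/completion pair per fuzzed submission:
    #   RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0
    #     -> submitted command: opcode 0x15, Reservation Release (the previous
    #        run's 0x11 is Reservation Acquire); sqid = submission queue id,
    #        cid = command identifier, nsid = namespace id
    #   INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
    #     -> completion: status code type 0x0 (generic) / status code 0x0b;
    #        cdw0 = command-specific result, sqhd = SQ head pointer, p = phase
    #        tag, m = more, dnr = do-not-retry. dnr:1 means the target rejected
    #        the command as non-retryable, the common outcome for these
    #        randomized reservation commands against nsid 0.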
00:08:25.602 [2024-11-16 16:44:11.150358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.602 [2024-11-16 16:44:11.150377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.602 [2024-11-16 16:44:11.150495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.602 [2024-11-16 16:44:11.150517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.602 #15 NEW cov: 11762 ft: 13401 corp: 4/70b lim: 50 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 CrossOver- 00:08:25.602 [2024-11-16 16:44:11.189858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.602 [2024-11-16 16:44:11.189889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.602 #16 NEW cov: 11847 ft: 13750 corp: 5/81b lim: 50 exec/s: 0 rss: 67Mb L: 11/34 MS: 1 ChangeBinInt- 00:08:25.602 [2024-11-16 16:44:11.239998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.602 [2024-11-16 16:44:11.240029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.602 #19 NEW cov: 11847 ft: 13898 corp: 6/92b lim: 50 exec/s: 0 rss: 67Mb L: 11/34 MS: 3 InsertRepeatedBytes-CrossOver-InsertRepeatedBytes- 00:08:25.602 [2024-11-16 16:44:11.280373] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.602 [2024-11-16 16:44:11.280399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.602 [2024-11-16 16:44:11.280515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.602 [2024-11-16 16:44:11.280539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.602 #20 NEW cov: 11847 ft: 13944 corp: 7/116b lim: 50 exec/s: 0 rss: 67Mb L: 24/34 MS: 1 ChangeBinInt- 00:08:25.602 [2024-11-16 16:44:11.320687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.602 [2024-11-16 16:44:11.320721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.602 [2024-11-16 16:44:11.320834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.602 [2024-11-16 16:44:11.320855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.602 [2024-11-16 16:44:11.320972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.602 [2024-11-16 16:44:11.320993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.602 #21 NEW cov: 11847 ft: 13994 corp: 8/148b lim: 50 exec/s: 0 rss: 67Mb L: 32/34 MS: 1 InsertRepeatedBytes- 00:08:25.862 [2024-11-16 16:44:11.361045] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.862 [2024-11-16 16:44:11.361079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.361184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.862 [2024-11-16 16:44:11.361205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.361321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.862 [2024-11-16 16:44:11.361338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.361456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:25.862 [2024-11-16 16:44:11.361474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.862 #22 NEW cov: 11847 ft: 14351 corp: 9/195b lim: 50 exec/s: 0 rss: 67Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:25.862 [2024-11-16 16:44:11.401182] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.862 [2024-11-16 16:44:11.401213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.401330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.862 [2024-11-16 16:44:11.401352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.401462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.862 [2024-11-16 16:44:11.401485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.401606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:25.862 [2024-11-16 16:44:11.401628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.862 #23 NEW cov: 11847 ft: 14376 corp: 10/242b lim: 50 exec/s: 0 rss: 68Mb L: 47/47 MS: 1 ShuffleBytes- 00:08:25.862 [2024-11-16 16:44:11.441123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.862 [2024-11-16 16:44:11.441153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.441249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.862 [2024-11-16 16:44:11.441273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.441394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.862 [2024-11-16 
16:44:11.441418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.862 #24 NEW cov: 11847 ft: 14421 corp: 11/276b lim: 50 exec/s: 0 rss: 68Mb L: 34/47 MS: 1 ChangeBinInt- 00:08:25.862 [2024-11-16 16:44:11.480765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.862 [2024-11-16 16:44:11.480795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.862 #25 NEW cov: 11847 ft: 14448 corp: 12/288b lim: 50 exec/s: 0 rss: 68Mb L: 12/47 MS: 1 InsertByte- 00:08:25.862 [2024-11-16 16:44:11.521372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.862 [2024-11-16 16:44:11.521402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.521496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.862 [2024-11-16 16:44:11.521515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.521623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.862 [2024-11-16 16:44:11.521643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.862 #26 NEW cov: 11847 ft: 14529 corp: 13/322b lim: 50 exec/s: 0 rss: 68Mb L: 34/47 MS: 1 ChangeBinInt- 00:08:25.862 [2024-11-16 16:44:11.561641] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.862 [2024-11-16 16:44:11.561673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.561697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.862 [2024-11-16 16:44:11.561707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.561723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.862 [2024-11-16 16:44:11.561743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.561852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:25.862 [2024-11-16 16:44:11.561870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.862 #27 NEW cov: 11847 ft: 14575 corp: 14/370b lim: 50 exec/s: 0 rss: 68Mb L: 48/48 MS: 1 CopyPart- 00:08:25.862 [2024-11-16 16:44:11.601703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:25.862 [2024-11-16 16:44:11.601731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.601821] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:25.862 [2024-11-16 16:44:11.601841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.601961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:25.862 [2024-11-16 16:44:11.601980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.862 [2024-11-16 16:44:11.602088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:25.862 [2024-11-16 16:44:11.602109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.122 #28 NEW cov: 11847 ft: 14605 corp: 15/413b lim: 50 exec/s: 0 rss: 68Mb L: 43/48 MS: 1 InsertRepeatedBytes- 00:08:26.123 [2024-11-16 16:44:11.641524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.123 [2024-11-16 16:44:11.641552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.123 [2024-11-16 16:44:11.641658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.123 [2024-11-16 16:44:11.641684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.123 [2024-11-16 16:44:11.641812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.123 [2024-11-16 16:44:11.641836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.123 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:26.123 #29 NEW cov: 11870 ft: 14645 corp: 16/449b lim: 50 exec/s: 0 rss: 68Mb L: 36/48 MS: 1 EraseBytes- 00:08:26.123 [2024-11-16 16:44:11.691966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.123 [2024-11-16 16:44:11.691997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.123 [2024-11-16 16:44:11.692114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.123 [2024-11-16 16:44:11.692140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.123 [2024-11-16 16:44:11.692264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.123 [2024-11-16 16:44:11.692289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.123 #30 NEW cov: 11870 ft: 14693 corp: 17/483b lim: 50 exec/s: 0 rss: 68Mb L: 34/48 MS: 1 ChangeBit- 00:08:26.123 [2024-11-16 16:44:11.741793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.123 [2024-11-16 16:44:11.741822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.123 [2024-11-16 16:44:11.741941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.123 [2024-11-16 16:44:11.741964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.123 #35 NEW cov: 11870 ft: 14741 corp: 18/504b lim: 50 exec/s: 35 rss: 68Mb L: 21/48 MS: 5 EraseBytes-CrossOver-EraseBytes-ChangeBit-InsertRepeatedBytes- 00:08:26.123 [2024-11-16 16:44:11.781520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.123 [2024-11-16 16:44:11.781553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.123 #36 NEW cov: 11870 ft: 14767 corp: 19/516b lim: 50 exec/s: 36 rss: 68Mb L: 12/48 MS: 1 ChangeBit- 00:08:26.123 [2024-11-16 16:44:11.821712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.123 [2024-11-16 16:44:11.821744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.123 #37 NEW cov: 11870 ft: 14794 corp: 20/527b lim: 50 exec/s: 37 rss: 68Mb L: 11/48 MS: 1 ChangeBit- 00:08:26.123 [2024-11-16 16:44:11.861805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.123 [2024-11-16 16:44:11.861834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.382 #38 NEW cov: 11870 ft: 14806 corp: 21/543b lim: 50 exec/s: 38 rss: 68Mb L: 16/48 MS: 1 CrossOver- 00:08:26.382 [2024-11-16 16:44:11.902558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.382 [2024-11-16 16:44:11.902587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.382 [2024-11-16 16:44:11.902717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.382 [2024-11-16 16:44:11.902741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.382 [2024-11-16 16:44:11.902867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.382 [2024-11-16 16:44:11.902888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.382 #39 NEW cov: 11870 ft: 14835 corp: 22/577b lim: 50 exec/s: 39 rss: 68Mb L: 34/48 MS: 1 ChangeByte- 00:08:26.382 [2024-11-16 16:44:11.952836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.382 [2024-11-16 16:44:11.952877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.382 [2024-11-16 16:44:11.952994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.382 [2024-11-16 16:44:11.953018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:26.382 [2024-11-16 16:44:11.953128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.382 [2024-11-16 16:44:11.953147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.382 [2024-11-16 16:44:11.953260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.382 [2024-11-16 16:44:11.953281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.382 #40 NEW cov: 11870 ft: 14864 corp: 23/625b lim: 50 exec/s: 40 rss: 68Mb L: 48/48 MS: 1 ChangeByte- 00:08:26.382 [2024-11-16 16:44:11.992760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.382 [2024-11-16 16:44:11.992791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.382 [2024-11-16 16:44:11.992908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.383 [2024-11-16 16:44:11.992931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.383 [2024-11-16 16:44:11.993045] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.383 [2024-11-16 16:44:11.993067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.383 #41 NEW cov: 11870 ft: 14884 corp: 24/664b lim: 50 exec/s: 41 rss: 68Mb L: 39/48 MS: 1 EraseBytes- 00:08:26.383 [2024-11-16 16:44:12.042870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.383 [2024-11-16 16:44:12.042900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.383 [2024-11-16 16:44:12.042996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.383 [2024-11-16 16:44:12.043016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.383 [2024-11-16 16:44:12.043139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.383 [2024-11-16 16:44:12.043160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.383 #42 NEW cov: 11870 ft: 14898 corp: 25/703b lim: 50 exec/s: 42 rss: 68Mb L: 39/48 MS: 1 CopyPart- 00:08:26.383 [2024-11-16 16:44:12.092997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.383 [2024-11-16 16:44:12.093027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.383 [2024-11-16 16:44:12.093141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.383 [2024-11-16 16:44:12.093162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:26.383 [2024-11-16 16:44:12.093290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.383 [2024-11-16 16:44:12.093313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.383 #43 NEW cov: 11870 ft: 14915 corp: 26/737b lim: 50 exec/s: 43 rss: 69Mb L: 34/48 MS: 1 CMP- DE: "\307\211(71\027\212\000"- 00:08:26.642 [2024-11-16 16:44:12.133392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-11-16 16:44:12.133427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.133510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.643 [2024-11-16 16:44:12.133534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.133658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.643 [2024-11-16 16:44:12.133686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.133817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.643 [2024-11-16 16:44:12.133838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.643 #44 NEW cov: 11870 ft: 14924 corp: 27/785b lim: 50 exec/s: 44 rss: 69Mb L: 48/48 MS: 1 ChangeByte- 00:08:26.643 [2024-11-16 16:44:12.172800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-11-16 16:44:12.172827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 #45 NEW cov: 11870 ft: 14958 corp: 28/797b lim: 50 exec/s: 45 rss: 69Mb L: 12/48 MS: 1 PersAutoDict- DE: "\307\211(71\027\212\000"- 00:08:26.643 [2024-11-16 16:44:12.213443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-11-16 16:44:12.213473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.213579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.643 [2024-11-16 16:44:12.213602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.213718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.643 [2024-11-16 16:44:12.213743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.643 #46 NEW cov: 11870 ft: 14972 corp: 29/831b lim: 50 exec/s: 46 rss: 69Mb L: 34/48 MS: 1 ShuffleBytes- 00:08:26.643 [2024-11-16 16:44:12.253730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:08:26.643 [2024-11-16 16:44:12.253760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.253827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.643 [2024-11-16 16:44:12.253851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.253975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.643 [2024-11-16 16:44:12.253997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.254125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.643 [2024-11-16 16:44:12.254149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.643 #52 NEW cov: 11870 ft: 14995 corp: 30/879b lim: 50 exec/s: 52 rss: 69Mb L: 48/48 MS: 1 ChangeBinInt- 00:08:26.643 [2024-11-16 16:44:12.303675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-11-16 16:44:12.303706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.303808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.643 [2024-11-16 16:44:12.303838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.303961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.643 [2024-11-16 16:44:12.303984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.643 #53 NEW cov: 11870 ft: 15030 corp: 31/913b lim: 50 exec/s: 53 rss: 69Mb L: 34/48 MS: 1 ChangeBit- 00:08:26.643 [2024-11-16 16:44:12.354101] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.643 [2024-11-16 16:44:12.354131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.354211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.643 [2024-11-16 16:44:12.354234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.354346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.643 [2024-11-16 16:44:12.354365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.643 [2024-11-16 16:44:12.354466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.643 [2024-11-16 16:44:12.354488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.643 #54 NEW cov: 11870 ft: 15044 corp: 32/960b lim: 50 exec/s: 54 rss: 69Mb L: 47/48 MS: 1 CopyPart- 00:08:26.902 [2024-11-16 16:44:12.393462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.902 [2024-11-16 16:44:12.393491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.902 #55 NEW cov: 11870 ft: 15061 corp: 33/977b lim: 50 exec/s: 55 rss: 69Mb L: 17/48 MS: 1 CopyPart- 00:08:26.902 [2024-11-16 16:44:12.443835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.902 [2024-11-16 16:44:12.443866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.902 [2024-11-16 16:44:12.443978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.902 [2024-11-16 16:44:12.444003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.902 #56 NEW cov: 11870 ft: 15064 corp: 34/1000b lim: 50 exec/s: 56 rss: 69Mb L: 23/48 MS: 1 CrossOver- 00:08:26.902 [2024-11-16 16:44:12.494178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.902 [2024-11-16 16:44:12.494210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.902 [2024-11-16 16:44:12.494314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.902 [2024-11-16 16:44:12.494335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.902 [2024-11-16 16:44:12.494455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.902 [2024-11-16 16:44:12.494477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.902 #62 NEW cov: 11870 ft: 15107 corp: 35/1031b lim: 50 exec/s: 62 rss: 69Mb L: 31/48 MS: 1 InsertRepeatedBytes- 00:08:26.902 [2024-11-16 16:44:12.544059] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.902 [2024-11-16 16:44:12.544089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.902 [2024-11-16 16:44:12.544209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.902 [2024-11-16 16:44:12.544228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.902 #63 NEW cov: 11870 ft: 15149 corp: 36/1058b lim: 50 exec/s: 63 rss: 69Mb L: 27/48 MS: 1 EraseBytes- 00:08:26.902 [2024-11-16 16:44:12.584196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.902 [2024-11-16 16:44:12.584225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.902 [2024-11-16 
16:44:12.584339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.902 [2024-11-16 16:44:12.584356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.902 #64 NEW cov: 11870 ft: 15297 corp: 37/1086b lim: 50 exec/s: 64 rss: 69Mb L: 28/48 MS: 1 EraseBytes- 00:08:26.902 [2024-11-16 16:44:12.624731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.902 [2024-11-16 16:44:12.624764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.902 [2024-11-16 16:44:12.624848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.902 [2024-11-16 16:44:12.624869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.902 [2024-11-16 16:44:12.624979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.902 [2024-11-16 16:44:12.625000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.902 [2024-11-16 16:44:12.625118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.902 [2024-11-16 16:44:12.625137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.902 #65 NEW cov: 11870 ft: 15300 corp: 38/1135b lim: 50 exec/s: 65 rss: 69Mb L: 49/49 MS: 1 CopyPart- 00:08:27.162 [2024-11-16 16:44:12.664909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.162 [2024-11-16 16:44:12.664943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.162 [2024-11-16 16:44:12.665052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.162 [2024-11-16 16:44:12.665072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.162 [2024-11-16 16:44:12.665187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.162 [2024-11-16 16:44:12.665210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.162 [2024-11-16 16:44:12.665327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.162 [2024-11-16 16:44:12.665348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.162 #66 NEW cov: 11870 ft: 15331 corp: 39/1175b lim: 50 exec/s: 66 rss: 69Mb L: 40/49 MS: 1 InsertRepeatedBytes- 00:08:27.162 [2024-11-16 16:44:12.705245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.162 [2024-11-16 16:44:12.705273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.162 
[2024-11-16 16:44:12.705346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.162 [2024-11-16 16:44:12.705377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.162 [2024-11-16 16:44:12.705498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.162 [2024-11-16 16:44:12.705521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.162 [2024-11-16 16:44:12.705632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.162 [2024-11-16 16:44:12.705653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.162 [2024-11-16 16:44:12.705776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:27.162 [2024-11-16 16:44:12.705797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.162 #67 NEW cov: 11870 ft: 15393 corp: 40/1225b lim: 50 exec/s: 67 rss: 69Mb L: 50/50 MS: 1 InsertByte- 00:08:27.162 [2024-11-16 16:44:12.754931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.163 [2024-11-16 16:44:12.754963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.163 [2024-11-16 16:44:12.755072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.163 [2024-11-16 16:44:12.755092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.163 [2024-11-16 16:44:12.755205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.163 [2024-11-16 16:44:12.755227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.163 #68 NEW cov: 11870 ft: 15474 corp: 41/1259b lim: 50 exec/s: 34 rss: 70Mb L: 34/50 MS: 1 ChangeByte- 00:08:27.163 #68 DONE cov: 11870 ft: 15474 corp: 41/1259b lim: 50 exec/s: 34 rss: 70Mb 00:08:27.163 ###### Recommended dictionary. ###### 00:08:27.163 "\307\211(71\027\212\000" # Uses: 1 00:08:27.163 ###### End of recommended dictionary. 
###### 00:08:27.163 Done 68 runs in 2 second(s) 00:08:27.163 16:44:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:27.163 16:44:12 -- ../common.sh@72 -- # (( i++ )) 00:08:27.163 16:44:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:27.163 16:44:12 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:27.163 16:44:12 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:27.163 16:44:12 -- nvmf/run.sh@24 -- # local timen=1 00:08:27.163 16:44:12 -- nvmf/run.sh@25 -- # local core=0x1 00:08:27.163 16:44:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:27.163 16:44:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:27.163 16:44:12 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:27.163 16:44:12 -- nvmf/run.sh@29 -- # port=4422 00:08:27.163 16:44:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:27.163 16:44:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:27.163 16:44:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:27.422 16:44:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:27.422 [2024-11-16 16:44:12.938269] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:27.422 [2024-11-16 16:44:12.938334] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid492917 ] 00:08:27.422 EAL: No free 2048 kB hugepages reported on node 1 00:08:27.422 [2024-11-16 16:44:13.114929] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.422 [2024-11-16 16:44:13.135264] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:27.422 [2024-11-16 16:44:13.135377] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.682 [2024-11-16 16:44:13.186648] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:27.682 [2024-11-16 16:44:13.202945] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:27.682 INFO: Running with entropic power schedule (0xFF, 100). 00:08:27.682 INFO: Seed: 1057665514 00:08:27.682 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:27.682 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:27.682 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:27.682 INFO: A corpus is not provided, starting from an empty corpus 00:08:27.682 #2 INITED exec/s: 0 rss: 59Mb 00:08:27.682 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:27.682 This may also happen if the target rejected all inputs we tried so far 00:08:27.682 [2024-11-16 16:44:13.251988] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.682 [2024-11-16 16:44:13.252018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.941 NEW_FUNC[1/672]: 0x478a98 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:27.941 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:27.941 #7 NEW cov: 11669 ft: 11670 corp: 2/24b lim: 85 exec/s: 0 rss: 67Mb L: 23/23 MS: 5 ChangeByte-InsertByte-InsertByte-InsertByte-InsertRepeatedBytes- 00:08:27.941 [2024-11-16 16:44:13.572665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.941 [2024-11-16 16:44:13.572701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.941 #18 NEW cov: 11782 ft: 12113 corp: 3/47b lim: 85 exec/s: 0 rss: 67Mb L: 23/23 MS: 1 ChangeBit- 00:08:27.941 [2024-11-16 16:44:13.622892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.941 [2024-11-16 16:44:13.622920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.941 [2024-11-16 16:44:13.622966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:27.941 [2024-11-16 16:44:13.622981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.941 #20 NEW cov: 11788 ft: 13117 corp: 4/86b lim: 85 exec/s: 0 rss: 67Mb L: 39/39 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:27.941 [2024-11-16 16:44:13.662802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:27.941 [2024-11-16 16:44:13.662831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.941 #21 NEW cov: 11873 ft: 13424 corp: 5/110b lim: 85 exec/s: 0 rss: 67Mb L: 24/39 MS: 1 CrossOver- 00:08:28.201 [2024-11-16 16:44:13.703065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.201 [2024-11-16 16:44:13.703092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.201 [2024-11-16 16:44:13.703161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.201 [2024-11-16 16:44:13.703176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.201 #22 NEW cov: 11873 ft: 13691 corp: 6/149b lim: 85 exec/s: 0 rss: 67Mb L: 39/39 MS: 1 ShuffleBytes- 00:08:28.201 [2024-11-16 16:44:13.743315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.201 [2024-11-16 16:44:13.743342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.201 [2024-11-16 16:44:13.743380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.201 [2024-11-16 16:44:13.743394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.201 [2024-11-16 16:44:13.743444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.201 [2024-11-16 16:44:13.743459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.201 #26 NEW cov: 11873 ft: 14088 corp: 7/207b lim: 85 exec/s: 0 rss: 67Mb L: 58/58 MS: 4 CrossOver-ShuffleBytes-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:28.201 [2024-11-16 16:44:13.783153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.201 [2024-11-16 16:44:13.783179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.201 #27 NEW cov: 11873 ft: 14178 corp: 8/233b lim: 85 exec/s: 0 rss: 67Mb L: 26/58 MS: 1 CrossOver- 00:08:28.201 [2024-11-16 16:44:13.823325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.201 [2024-11-16 16:44:13.823351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.201 #28 NEW cov: 11873 ft: 14277 corp: 9/257b lim: 85 exec/s: 0 rss: 67Mb L: 24/58 MS: 1 CopyPart- 00:08:28.201 [2024-11-16 16:44:13.863555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.201 [2024-11-16 16:44:13.863581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.201 [2024-11-16 16:44:13.863619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.201 [2024-11-16 16:44:13.863639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.202 #29 NEW cov: 11873 ft: 14304 corp: 10/292b lim: 85 exec/s: 0 rss: 67Mb L: 35/58 MS: 1 InsertRepeatedBytes- 00:08:28.202 [2024-11-16 16:44:13.903634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.202 [2024-11-16 16:44:13.903661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.202 [2024-11-16 16:44:13.903707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.202 [2024-11-16 16:44:13.903722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.202 #30 NEW cov: 11873 ft: 14338 corp: 11/334b lim: 85 exec/s: 0 rss: 67Mb L: 42/58 MS: 1 InsertRepeatedBytes- 00:08:28.202 [2024-11-16 16:44:13.943709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.202 [2024-11-16 16:44:13.943736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.461 #31 NEW cov: 11873 ft: 14360 corp: 12/358b lim: 85 exec/s: 0 rss: 67Mb L: 24/58 MS: 1 ChangeBinInt- 00:08:28.461 [2024-11-16 16:44:13.983741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.461 [2024-11-16 16:44:13.983768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.461 #32 NEW cov: 11873 ft: 14384 corp: 13/387b lim: 85 exec/s: 0 rss: 67Mb L: 29/58 MS: 1 EraseBytes- 00:08:28.461 [2024-11-16 16:44:14.023892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.461 [2024-11-16 16:44:14.023918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.461 #33 NEW cov: 11873 ft: 14402 corp: 14/410b lim: 85 exec/s: 0 rss: 68Mb L: 23/58 MS: 1 ShuffleBytes- 00:08:28.461 [2024-11-16 16:44:14.063961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.461 [2024-11-16 16:44:14.063988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.461 #34 NEW cov: 11873 ft: 14422 corp: 15/434b lim: 85 exec/s: 0 rss: 68Mb L: 24/58 MS: 1 ShuffleBytes- 00:08:28.461 [2024-11-16 16:44:14.104662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.461 [2024-11-16 16:44:14.104696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.461 [2024-11-16 16:44:14.104749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.462 [2024-11-16 16:44:14.104764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.462 [2024-11-16 16:44:14.104814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.462 [2024-11-16 16:44:14.104829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.462 [2024-11-16 16:44:14.104879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.462 [2024-11-16 16:44:14.104895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.462 [2024-11-16 16:44:14.104947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:28.462 [2024-11-16 16:44:14.104962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.462 #35 NEW cov: 11873 ft: 14853 corp: 16/519b lim: 85 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 CrossOver- 00:08:28.462 [2024-11-16 16:44:14.154269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.462 [2024-11-16 16:44:14.154295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.462 NEW_FUNC[1/1]: 0x1967b18 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:28.462 #37 NEW cov: 11896 ft: 14890 corp: 17/544b lim: 85 exec/s: 0 rss: 68Mb L: 25/85 MS: 2 ShuffleBytes-CrossOver- 00:08:28.462 [2024-11-16 16:44:14.194497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.462 [2024-11-16 16:44:14.194524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.462 [2024-11-16 16:44:14.194571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.462 [2024-11-16 16:44:14.194587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.721 #38 NEW cov: 11896 ft: 14924 corp: 18/580b lim: 85 exec/s: 0 rss: 68Mb L: 36/85 MS: 1 EraseBytes- 00:08:28.721 [2024-11-16 16:44:14.234614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.721 [2024-11-16 16:44:14.234640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.721 [2024-11-16 16:44:14.234679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.721 [2024-11-16 16:44:14.234694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.721 #39 NEW cov: 11896 ft: 15002 corp: 19/619b lim: 85 exec/s: 39 rss: 68Mb L: 39/85 MS: 1 CrossOver- 00:08:28.721 [2024-11-16 16:44:14.274726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.721 [2024-11-16 16:44:14.274752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.721 [2024-11-16 16:44:14.274788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.721 [2024-11-16 16:44:14.274803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.721 #40 NEW cov: 11896 ft: 15046 corp: 20/658b lim: 85 exec/s: 40 rss: 68Mb L: 39/85 MS: 1 ChangeBit- 00:08:28.721 [2024-11-16 16:44:14.314891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.722 [2024-11-16 16:44:14.314917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.722 [2024-11-16 16:44:14.314970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.722 [2024-11-16 16:44:14.314985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.722 #41 NEW cov: 11896 ft: 15066 corp: 21/698b lim: 85 exec/s: 41 rss: 68Mb L: 40/85 MS: 1 CrossOver- 00:08:28.722 [2024-11-16 16:44:14.354848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.722 [2024-11-16 16:44:14.354875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.722 
#42 NEW cov: 11896 ft: 15121 corp: 22/721b lim: 85 exec/s: 42 rss: 68Mb L: 23/85 MS: 1 ChangeByte- 00:08:28.722 [2024-11-16 16:44:14.394986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.722 [2024-11-16 16:44:14.395014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.722 #43 NEW cov: 11896 ft: 15155 corp: 23/750b lim: 85 exec/s: 43 rss: 68Mb L: 29/85 MS: 1 ChangeBinInt- 00:08:28.722 [2024-11-16 16:44:14.435271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.722 [2024-11-16 16:44:14.435298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.722 [2024-11-16 16:44:14.435346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.722 [2024-11-16 16:44:14.435362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.722 #44 NEW cov: 11896 ft: 15175 corp: 24/789b lim: 85 exec/s: 44 rss: 68Mb L: 39/85 MS: 1 ChangeByte- 00:08:28.982 [2024-11-16 16:44:14.475638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.982 [2024-11-16 16:44:14.475664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.475718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.982 [2024-11-16 16:44:14.475734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.475785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.982 [2024-11-16 16:44:14.475800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.475853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.982 [2024-11-16 16:44:14.475867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.982 #45 NEW cov: 11896 ft: 15195 corp: 25/870b lim: 85 exec/s: 45 rss: 68Mb L: 81/85 MS: 1 CrossOver- 00:08:28.982 [2024-11-16 16:44:14.515290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.982 [2024-11-16 16:44:14.515317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.982 #46 NEW cov: 11896 ft: 15272 corp: 26/893b lim: 85 exec/s: 46 rss: 68Mb L: 23/85 MS: 1 ShuffleBytes- 00:08:28.982 [2024-11-16 16:44:14.555847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.982 [2024-11-16 16:44:14.555873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.555908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.982 [2024-11-16 16:44:14.555922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.555971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.982 [2024-11-16 16:44:14.555986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.556036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.982 [2024-11-16 16:44:14.556051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.982 #47 NEW cov: 11896 ft: 15283 corp: 27/974b lim: 85 exec/s: 47 rss: 68Mb L: 81/85 MS: 1 ChangeBinInt- 00:08:28.982 [2024-11-16 16:44:14.595646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.982 [2024-11-16 16:44:14.595677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.595729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.982 [2024-11-16 16:44:14.595749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.982 #48 NEW cov: 11896 ft: 15305 corp: 28/1016b lim: 85 exec/s: 48 rss: 68Mb L: 42/85 MS: 1 CrossOver- 00:08:28.982 [2024-11-16 16:44:14.636092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.982 [2024-11-16 16:44:14.636119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.636154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.982 [2024-11-16 16:44:14.636170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.636220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.982 [2024-11-16 16:44:14.636236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.636289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:28.982 [2024-11-16 16:44:14.636305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.982 #49 NEW cov: 11896 ft: 15314 corp: 29/1098b lim: 85 exec/s: 49 rss: 68Mb L: 82/85 MS: 1 InsertByte- 00:08:28.982 [2024-11-16 16:44:14.685946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.982 [2024-11-16 16:44:14.685973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.982 [2024-11-16 16:44:14.686026] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.982 [2024-11-16 16:44:14.686041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.982 #50 NEW cov: 11896 ft: 15334 corp: 30/1140b lim: 85 exec/s: 50 rss: 68Mb L: 42/85 MS: 1 ChangeByte- 00:08:28.982 [2024-11-16 16:44:14.725935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.982 [2024-11-16 16:44:14.725962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.242 #51 NEW cov: 11896 ft: 15387 corp: 31/1169b lim: 85 exec/s: 51 rss: 68Mb L: 29/85 MS: 1 CopyPart- 00:08:29.242 [2024-11-16 16:44:14.766003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.242 [2024-11-16 16:44:14.766030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.242 #52 NEW cov: 11896 ft: 15399 corp: 32/1202b lim: 85 exec/s: 52 rss: 69Mb L: 33/85 MS: 1 CMP- DE: "G\000\000\000\000\000\000\000"- 00:08:29.242 [2024-11-16 16:44:14.806160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.242 [2024-11-16 16:44:14.806186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.242 #53 NEW cov: 11896 ft: 15418 corp: 33/1234b lim: 85 exec/s: 53 rss: 69Mb L: 32/85 MS: 1 PersAutoDict- DE: "G\000\000\000\000\000\000\000"- 00:08:29.242 [2024-11-16 16:44:14.836218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.242 [2024-11-16 16:44:14.836244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.242 #54 NEW cov: 11896 ft: 15426 corp: 34/1258b lim: 85 exec/s: 54 rss: 69Mb L: 24/85 MS: 1 ChangeByte- 00:08:29.242 [2024-11-16 16:44:14.866293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.242 [2024-11-16 16:44:14.866318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.242 #55 NEW cov: 11896 ft: 15470 corp: 35/1277b lim: 85 exec/s: 55 rss: 69Mb L: 19/85 MS: 1 EraseBytes- 00:08:29.242 [2024-11-16 16:44:14.906458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.242 [2024-11-16 16:44:14.906483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.242 [2024-11-16 16:44:14.936672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.242 [2024-11-16 16:44:14.936698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.242 [2024-11-16 16:44:14.936737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.242 [2024-11-16 16:44:14.936751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:29.242 #57 NEW cov: 11896 ft: 15486 corp: 36/1314b lim: 85 exec/s: 57 rss: 69Mb L: 37/85 MS: 2 ChangeByte-PersAutoDict- DE: "G\000\000\000\000\000\000\000"- 00:08:29.242 [2024-11-16 16:44:14.977083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.242 [2024-11-16 16:44:14.977109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.242 [2024-11-16 16:44:14.977157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.242 [2024-11-16 16:44:14.977172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.242 [2024-11-16 16:44:14.977221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.242 [2024-11-16 16:44:14.977236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.242 [2024-11-16 16:44:14.977287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.242 [2024-11-16 16:44:14.977300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.502 #58 NEW cov: 11896 ft: 15563 corp: 37/1387b lim: 85 exec/s: 58 rss: 69Mb L: 73/85 MS: 1 InsertRepeatedBytes- 00:08:29.502 [2024-11-16 16:44:15.026964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.502 [2024-11-16 16:44:15.026990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.502 [2024-11-16 16:44:15.027025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.502 [2024-11-16 16:44:15.027040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.502 #59 NEW cov: 11896 ft: 15614 corp: 38/1426b lim: 85 exec/s: 59 rss: 69Mb L: 39/85 MS: 1 CopyPart- 00:08:29.502 [2024-11-16 16:44:15.067086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.502 [2024-11-16 16:44:15.067114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.502 [2024-11-16 16:44:15.067160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.502 [2024-11-16 16:44:15.067178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.502 #60 NEW cov: 11896 ft: 15624 corp: 39/1468b lim: 85 exec/s: 60 rss: 69Mb L: 42/85 MS: 1 CopyPart- 00:08:29.502 [2024-11-16 16:44:15.107042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.502 [2024-11-16 16:44:15.107073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.502 #61 NEW cov: 11896 ft: 15625 corp: 40/1494b lim: 85 exec/s: 61 rss: 69Mb L: 26/85 MS: 1 InsertByte- 00:08:29.502 [2024-11-16 
16:44:15.147152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.502 [2024-11-16 16:44:15.147178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.502 #62 NEW cov: 11896 ft: 15629 corp: 41/1511b lim: 85 exec/s: 62 rss: 69Mb L: 17/85 MS: 1 EraseBytes- 00:08:29.502 [2024-11-16 16:44:15.177234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.502 [2024-11-16 16:44:15.177260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.502 #63 NEW cov: 11896 ft: 15634 corp: 42/1530b lim: 85 exec/s: 63 rss: 69Mb L: 19/85 MS: 1 ShuffleBytes- 00:08:29.502 [2024-11-16 16:44:15.217507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.502 [2024-11-16 16:44:15.217534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.502 [2024-11-16 16:44:15.217582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.502 [2024-11-16 16:44:15.217598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.502 #64 pulse cov: 11896 ft: 15644 corp: 42/1530b lim: 85 exec/s: 32 rss: 69Mb 00:08:29.502 #64 NEW cov: 11896 ft: 15644 corp: 43/1565b lim: 85 exec/s: 32 rss: 69Mb L: 35/85 MS: 1 ChangeBit- 00:08:29.502 #64 DONE cov: 11896 ft: 15644 corp: 43/1565b lim: 85 exec/s: 32 rss: 69Mb 00:08:29.502 ###### Recommended dictionary. ###### 00:08:29.502 "G\000\000\000\000\000\000\000" # Uses: 2 00:08:29.502 ###### End of recommended dictionary. 
###### 00:08:29.502 Done 64 runs in 2 second(s) 00:08:29.762 16:44:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:29.762 16:44:15 -- ../common.sh@72 -- # (( i++ )) 00:08:29.762 16:44:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:29.762 16:44:15 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:29.762 16:44:15 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:29.762 16:44:15 -- nvmf/run.sh@24 -- # local timen=1 00:08:29.762 16:44:15 -- nvmf/run.sh@25 -- # local core=0x1 00:08:29.762 16:44:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:29.762 16:44:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:29.762 16:44:15 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:29.762 16:44:15 -- nvmf/run.sh@29 -- # port=4423 00:08:29.762 16:44:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:29.762 16:44:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:29.762 16:44:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:29.762 16:44:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:29.762 [2024-11-16 16:44:15.394854] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:29.762 [2024-11-16 16:44:15.394919] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid493346 ] 00:08:29.762 EAL: No free 2048 kB hugepages reported on node 1 00:08:30.022 [2024-11-16 16:44:15.575495] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.022 [2024-11-16 16:44:15.595516] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:30.022 [2024-11-16 16:44:15.595629] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.022 [2024-11-16 16:44:15.646921] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:30.022 [2024-11-16 16:44:15.663254] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:30.022 INFO: Running with entropic power schedule (0xFF, 100). 00:08:30.022 INFO: Seed: 3517692171 00:08:30.022 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:30.022 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:30.022 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.022 INFO: A corpus is not provided, starting from an empty corpus 00:08:30.022 #2 INITED exec/s: 0 rss: 59Mb 00:08:30.022 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:30.022 This may also happen if the target rejected all inputs we tried so far 00:08:30.022 [2024-11-16 16:44:15.712009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.022 [2024-11-16 16:44:15.712040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.281 NEW_FUNC[1/671]: 0x47bcd8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:30.281 NEW_FUNC[2/671]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:30.281 #7 NEW cov: 11602 ft: 11603 corp: 2/9b lim: 25 exec/s: 0 rss: 67Mb L: 8/8 MS: 5 ShuffleBytes-CopyPart-EraseBytes-InsertByte-InsertRepeatedBytes- 00:08:30.281 [2024-11-16 16:44:16.012993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.281 [2024-11-16 16:44:16.013027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.281 [2024-11-16 16:44:16.013095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.281 [2024-11-16 16:44:16.013115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.281 [2024-11-16 16:44:16.013183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:30.281 [2024-11-16 16:44:16.013205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.281 [2024-11-16 16:44:16.013287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:30.281 [2024-11-16 16:44:16.013310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.550 #8 NEW cov: 11715 ft: 12726 corp: 3/33b lim: 25 exec/s: 0 rss: 67Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:30.550 [2024-11-16 16:44:16.062984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.550 [2024-11-16 16:44:16.063015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.550 [2024-11-16 16:44:16.063080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.550 [2024-11-16 16:44:16.063102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.550 [2024-11-16 16:44:16.063169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:30.550 [2024-11-16 16:44:16.063187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.550 #9 NEW cov: 11721 ft: 13173 corp: 4/48b lim: 25 exec/s: 0 rss: 67Mb L: 15/24 MS: 1 CopyPart- 00:08:30.550 [2024-11-16 16:44:16.102860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.550 [2024-11-16 16:44:16.102892] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.551 #11 NEW cov: 11806 ft: 13563 corp: 5/57b lim: 25 exec/s: 0 rss: 67Mb L: 9/24 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:30.551 [2024-11-16 16:44:16.143207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.551 [2024-11-16 16:44:16.143236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.551 [2024-11-16 16:44:16.143302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.551 [2024-11-16 16:44:16.143324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.551 [2024-11-16 16:44:16.143390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:30.551 [2024-11-16 16:44:16.143409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.551 #12 NEW cov: 11806 ft: 13729 corp: 6/72b lim: 25 exec/s: 0 rss: 67Mb L: 15/24 MS: 1 CMP- DE: "\000\212\0273\221o+\370"- 00:08:30.551 [2024-11-16 16:44:16.183062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.551 [2024-11-16 16:44:16.183092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.551 #13 NEW cov: 11806 ft: 13877 corp: 7/79b lim: 25 exec/s: 0 rss: 67Mb L: 7/24 MS: 1 CrossOver- 00:08:30.551 [2024-11-16 16:44:16.223192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.551 [2024-11-16 16:44:16.223222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.551 #14 NEW cov: 11806 ft: 14017 corp: 8/88b lim: 25 exec/s: 0 rss: 67Mb L: 9/24 MS: 1 ChangeBinInt- 00:08:30.551 [2024-11-16 16:44:16.263290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.551 [2024-11-16 16:44:16.263319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.551 #15 NEW cov: 11806 ft: 14090 corp: 9/97b lim: 25 exec/s: 0 rss: 67Mb L: 9/24 MS: 1 ChangeByte- 00:08:30.809 [2024-11-16 16:44:16.303449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.809 [2024-11-16 16:44:16.303478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.809 #16 NEW cov: 11806 ft: 14188 corp: 10/106b lim: 25 exec/s: 0 rss: 67Mb L: 9/24 MS: 1 CopyPart- 00:08:30.809 [2024-11-16 16:44:16.343636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.809 [2024-11-16 16:44:16.343664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.809 [2024-11-16 16:44:16.343739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.809 
[2024-11-16 16:44:16.343760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.809 #17 NEW cov: 11806 ft: 14427 corp: 11/116b lim: 25 exec/s: 0 rss: 67Mb L: 10/24 MS: 1 InsertByte- 00:08:30.809 [2024-11-16 16:44:16.383945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.809 [2024-11-16 16:44:16.383973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.809 [2024-11-16 16:44:16.384039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.809 [2024-11-16 16:44:16.384060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.809 [2024-11-16 16:44:16.384129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:30.809 [2024-11-16 16:44:16.384148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.809 #18 NEW cov: 11806 ft: 14447 corp: 12/132b lim: 25 exec/s: 0 rss: 68Mb L: 16/24 MS: 1 InsertRepeatedBytes- 00:08:30.809 [2024-11-16 16:44:16.423787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.809 [2024-11-16 16:44:16.423816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.809 #19 NEW cov: 11806 ft: 14507 corp: 13/141b lim: 25 exec/s: 0 rss: 68Mb L: 9/24 MS: 1 ChangeByte- 00:08:30.809 [2024-11-16 16:44:16.464123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.809 [2024-11-16 16:44:16.464152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.809 [2024-11-16 16:44:16.464216] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.809 [2024-11-16 16:44:16.464238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.809 [2024-11-16 16:44:16.464304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:30.809 [2024-11-16 16:44:16.464325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.809 #20 NEW cov: 11806 ft: 14535 corp: 14/158b lim: 25 exec/s: 0 rss: 68Mb L: 17/24 MS: 1 PersAutoDict- DE: "\000\212\0273\221o+\370"- 00:08:30.809 [2024-11-16 16:44:16.504119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.809 [2024-11-16 16:44:16.504147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.809 [2024-11-16 16:44:16.504215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.809 [2024-11-16 16:44:16.504237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.809 #21 NEW cov: 
11806 ft: 14579 corp: 15/168b lim: 25 exec/s: 0 rss: 68Mb L: 10/24 MS: 1 InsertByte- 00:08:30.809 [2024-11-16 16:44:16.544329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:30.809 [2024-11-16 16:44:16.544358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.809 [2024-11-16 16:44:16.544425] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:30.809 [2024-11-16 16:44:16.544447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.809 [2024-11-16 16:44:16.544515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:30.809 [2024-11-16 16:44:16.544535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.069 #22 NEW cov: 11806 ft: 14603 corp: 16/186b lim: 25 exec/s: 0 rss: 68Mb L: 18/24 MS: 1 InsertByte- 00:08:31.069 [2024-11-16 16:44:16.584629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.069 [2024-11-16 16:44:16.584657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.584722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.069 [2024-11-16 16:44:16.584743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.584815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.069 [2024-11-16 16:44:16.584835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.584900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.069 [2024-11-16 16:44:16.584919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.069 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:31.069 #23 NEW cov: 11829 ft: 14650 corp: 17/208b lim: 25 exec/s: 0 rss: 68Mb L: 22/24 MS: 1 EraseBytes- 00:08:31.069 [2024-11-16 16:44:16.634752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.069 [2024-11-16 16:44:16.634782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.634849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.069 [2024-11-16 16:44:16.634871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.634937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.069 [2024-11-16 16:44:16.634956] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.635023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.069 [2024-11-16 16:44:16.635042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.069 #24 NEW cov: 11829 ft: 14681 corp: 18/232b lim: 25 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 ChangeBit- 00:08:31.069 [2024-11-16 16:44:16.674702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.069 [2024-11-16 16:44:16.674732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.674800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.069 [2024-11-16 16:44:16.674823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.069 #25 NEW cov: 11829 ft: 14709 corp: 19/242b lim: 25 exec/s: 25 rss: 68Mb L: 10/24 MS: 1 ChangeByte- 00:08:31.069 [2024-11-16 16:44:16.714899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.069 [2024-11-16 16:44:16.714929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.714996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.069 [2024-11-16 16:44:16.715020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.715088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.069 [2024-11-16 16:44:16.715105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.069 #26 NEW cov: 11829 ft: 14766 corp: 20/259b lim: 25 exec/s: 26 rss: 68Mb L: 17/24 MS: 1 EraseBytes- 00:08:31.069 [2024-11-16 16:44:16.754774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.069 [2024-11-16 16:44:16.754803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.069 #27 NEW cov: 11829 ft: 14780 corp: 21/268b lim: 25 exec/s: 27 rss: 68Mb L: 9/24 MS: 1 CopyPart- 00:08:31.069 [2024-11-16 16:44:16.785079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.069 [2024-11-16 16:44:16.785109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.785174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.069 [2024-11-16 16:44:16.785196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.069 [2024-11-16 16:44:16.785260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 
00:08:31.069 [2024-11-16 16:44:16.785279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.070 #33 NEW cov: 11829 ft: 14833 corp: 22/284b lim: 25 exec/s: 33 rss: 68Mb L: 16/24 MS: 1 InsertRepeatedBytes- 00:08:31.329 [2024-11-16 16:44:16.824974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.329 [2024-11-16 16:44:16.825004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.329 #34 NEW cov: 11829 ft: 14836 corp: 23/293b lim: 25 exec/s: 34 rss: 68Mb L: 9/24 MS: 1 CMP- DE: "\001\006"- 00:08:31.329 [2024-11-16 16:44:16.865258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.329 [2024-11-16 16:44:16.865287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.329 [2024-11-16 16:44:16.865352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.329 [2024-11-16 16:44:16.865373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.329 [2024-11-16 16:44:16.865440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.329 [2024-11-16 16:44:16.865459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.329 #35 NEW cov: 11829 ft: 14853 corp: 24/309b lim: 25 exec/s: 35 rss: 68Mb L: 16/24 MS: 1 CopyPart- 00:08:31.329 [2024-11-16 16:44:16.905397] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.329 [2024-11-16 16:44:16.905427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.329 [2024-11-16 16:44:16.905494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.329 [2024-11-16 16:44:16.905516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.329 [2024-11-16 16:44:16.905583] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.329 [2024-11-16 16:44:16.905603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.329 #36 NEW cov: 11829 ft: 14876 corp: 25/325b lim: 25 exec/s: 36 rss: 68Mb L: 16/24 MS: 1 InsertByte- 00:08:31.329 [2024-11-16 16:44:16.945529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.329 [2024-11-16 16:44:16.945559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.329 [2024-11-16 16:44:16.945626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.330 [2024-11-16 16:44:16.945648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.330 [2024-11-16 
16:44:16.945730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.330 [2024-11-16 16:44:16.945749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.330 #37 NEW cov: 11829 ft: 14889 corp: 26/341b lim: 25 exec/s: 37 rss: 68Mb L: 16/24 MS: 1 ChangeBinInt- 00:08:31.330 [2024-11-16 16:44:16.985542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.330 [2024-11-16 16:44:16.985571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.330 [2024-11-16 16:44:16.985638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.330 [2024-11-16 16:44:16.985660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.330 #38 NEW cov: 11829 ft: 14900 corp: 27/353b lim: 25 exec/s: 38 rss: 69Mb L: 12/24 MS: 1 InsertRepeatedBytes- 00:08:31.330 [2024-11-16 16:44:17.025607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.330 [2024-11-16 16:44:17.025636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.330 [2024-11-16 16:44:17.025707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.330 [2024-11-16 16:44:17.025730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.330 #39 NEW cov: 11829 ft: 14906 corp: 28/364b lim: 25 exec/s: 39 rss: 69Mb L: 11/24 MS: 1 EraseBytes- 00:08:31.330 [2024-11-16 16:44:17.065788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.330 [2024-11-16 16:44:17.065817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.330 [2024-11-16 16:44:17.065885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.330 [2024-11-16 16:44:17.065906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.589 [2024-11-16 16:44:17.105899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.589 [2024-11-16 16:44:17.105929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.589 [2024-11-16 16:44:17.106001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.589 [2024-11-16 16:44:17.106023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.589 #41 NEW cov: 11829 ft: 14969 corp: 29/376b lim: 25 exec/s: 41 rss: 69Mb L: 12/24 MS: 2 ChangeByte-ChangeByte- 00:08:31.589 [2024-11-16 16:44:17.146098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.589 [2024-11-16 16:44:17.146126] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.589 [2024-11-16 16:44:17.146187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.589 [2024-11-16 16:44:17.146207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.589 [2024-11-16 16:44:17.146273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.589 [2024-11-16 16:44:17.146291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.589 #42 NEW cov: 11829 ft: 15019 corp: 30/394b lim: 25 exec/s: 42 rss: 69Mb L: 18/24 MS: 1 CopyPart- 00:08:31.589 [2024-11-16 16:44:17.186239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.589 [2024-11-16 16:44:17.186272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.589 [2024-11-16 16:44:17.186340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.590 [2024-11-16 16:44:17.186365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.590 [2024-11-16 16:44:17.186430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.590 [2024-11-16 16:44:17.186449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.590 #43 NEW cov: 11829 ft: 15028 corp: 31/412b lim: 25 exec/s: 43 rss: 69Mb L: 18/24 MS: 1 ChangeBinInt- 00:08:31.590 [2024-11-16 16:44:17.226087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.590 [2024-11-16 16:44:17.226117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.590 #44 NEW cov: 11829 ft: 15038 corp: 32/420b lim: 25 exec/s: 44 rss: 69Mb L: 8/24 MS: 1 ShuffleBytes- 00:08:31.590 [2024-11-16 16:44:17.266161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.590 [2024-11-16 16:44:17.266190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.590 #45 NEW cov: 11829 ft: 15108 corp: 33/428b lim: 25 exec/s: 45 rss: 69Mb L: 8/24 MS: 1 ChangeBit- 00:08:31.590 [2024-11-16 16:44:17.306313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.590 [2024-11-16 16:44:17.306342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.590 #46 NEW cov: 11829 ft: 15130 corp: 34/437b lim: 25 exec/s: 46 rss: 69Mb L: 9/24 MS: 1 CrossOver- 00:08:31.849 [2024-11-16 16:44:17.346430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.849 [2024-11-16 16:44:17.346459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:31.849 #47 NEW cov: 11829 ft: 15142 corp: 35/443b lim: 25 exec/s: 47 rss: 69Mb L: 6/24 MS: 1 EraseBytes- 00:08:31.849 [2024-11-16 16:44:17.386573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.849 [2024-11-16 16:44:17.386602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.849 #48 NEW cov: 11829 ft: 15147 corp: 36/449b lim: 25 exec/s: 48 rss: 69Mb L: 6/24 MS: 1 EraseBytes- 00:08:31.849 [2024-11-16 16:44:17.416987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.849 [2024-11-16 16:44:17.417016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.849 [2024-11-16 16:44:17.417077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.849 [2024-11-16 16:44:17.417098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.849 [2024-11-16 16:44:17.417163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.849 [2024-11-16 16:44:17.417180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.849 [2024-11-16 16:44:17.417245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.849 [2024-11-16 16:44:17.417264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.849 #49 NEW cov: 11829 ft: 15162 corp: 37/473b lim: 25 exec/s: 49 rss: 69Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:08:31.850 [2024-11-16 16:44:17.456863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.850 [2024-11-16 16:44:17.456892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.850 [2024-11-16 16:44:17.456962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.850 [2024-11-16 16:44:17.456985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.850 #50 NEW cov: 11829 ft: 15173 corp: 38/486b lim: 25 exec/s: 50 rss: 69Mb L: 13/24 MS: 1 InsertRepeatedBytes- 00:08:31.850 [2024-11-16 16:44:17.497188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.850 [2024-11-16 16:44:17.497217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.850 [2024-11-16 16:44:17.497284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.850 [2024-11-16 16:44:17.497305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.850 [2024-11-16 16:44:17.497371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.850 [2024-11-16 16:44:17.497389] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.850 [2024-11-16 16:44:17.497458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.850 [2024-11-16 16:44:17.497477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.850 #51 NEW cov: 11829 ft: 15181 corp: 39/506b lim: 25 exec/s: 51 rss: 69Mb L: 20/24 MS: 1 CopyPart- 00:08:31.850 [2024-11-16 16:44:17.537349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.850 [2024-11-16 16:44:17.537377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.850 [2024-11-16 16:44:17.537437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.850 [2024-11-16 16:44:17.537457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.850 [2024-11-16 16:44:17.537523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.850 [2024-11-16 16:44:17.537543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.850 [2024-11-16 16:44:17.537607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:31.850 [2024-11-16 16:44:17.537626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.850 #52 NEW cov: 11829 ft: 15191 corp: 40/526b lim: 25 exec/s: 52 rss: 69Mb L: 20/24 MS: 1 ChangeBinInt- 00:08:31.850 [2024-11-16 16:44:17.577343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.850 [2024-11-16 16:44:17.577372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.850 [2024-11-16 16:44:17.577438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:31.850 [2024-11-16 16:44:17.577460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.850 [2024-11-16 16:44:17.577526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:31.850 [2024-11-16 16:44:17.577551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.110 #53 NEW cov: 11829 ft: 15200 corp: 41/544b lim: 25 exec/s: 53 rss: 69Mb L: 18/24 MS: 1 CrossOver- 00:08:32.110 [2024-11-16 16:44:17.617319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.110 [2024-11-16 16:44:17.617348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.110 [2024-11-16 16:44:17.617417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.110 [2024-11-16 16:44:17.617440] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.110 #54 NEW cov: 11829 ft: 15217 corp: 42/555b lim: 25 exec/s: 54 rss: 69Mb L: 11/24 MS: 1 PersAutoDict- DE: "\001\006"- 00:08:32.110 [2024-11-16 16:44:17.657562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.110 [2024-11-16 16:44:17.657591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.110 [2024-11-16 16:44:17.657657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.110 [2024-11-16 16:44:17.657685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.110 [2024-11-16 16:44:17.657754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.110 [2024-11-16 16:44:17.657776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.110 #55 NEW cov: 11829 ft: 15231 corp: 43/571b lim: 25 exec/s: 55 rss: 70Mb L: 16/24 MS: 1 CopyPart- 00:08:32.110 [2024-11-16 16:44:17.697710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.110 [2024-11-16 16:44:17.697738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.110 [2024-11-16 16:44:17.697801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.110 [2024-11-16 16:44:17.697823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.111 [2024-11-16 16:44:17.697889] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.111 [2024-11-16 16:44:17.697909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.111 #56 NEW cov: 11829 ft: 15237 corp: 44/587b lim: 25 exec/s: 28 rss: 70Mb L: 16/24 MS: 1 ChangeByte- 00:08:32.111 #56 DONE cov: 11829 ft: 15237 corp: 44/587b lim: 25 exec/s: 28 rss: 70Mb 00:08:32.111 ###### Recommended dictionary. ###### 00:08:32.111 "\000\212\0273\221o+\370" # Uses: 1 00:08:32.111 "\001\006" # Uses: 1 00:08:32.111 ###### End of recommended dictionary. 
###### 00:08:32.111 Done 56 runs in 2 second(s) 00:08:32.111 16:44:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:32.111 16:44:17 -- ../common.sh@72 -- # (( i++ )) 00:08:32.111 16:44:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.111 16:44:17 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:32.111 16:44:17 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:32.111 16:44:17 -- nvmf/run.sh@24 -- # local timen=1 00:08:32.111 16:44:17 -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.111 16:44:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:32.111 16:44:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:32.111 16:44:17 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:32.111 16:44:17 -- nvmf/run.sh@29 -- # port=4424 00:08:32.111 16:44:17 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:32.111 16:44:17 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:32.111 16:44:17 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.111 16:44:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:32.371 [2024-11-16 16:44:17.881226] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:32.371 [2024-11-16 16:44:17.881326] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid493750 ] 00:08:32.371 EAL: No free 2048 kB hugepages reported on node 1 00:08:32.371 [2024-11-16 16:44:18.057310] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.371 [2024-11-16 16:44:18.076700] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:32.371 [2024-11-16 16:44:18.076812] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.631 [2024-11-16 16:44:18.128077] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:32.631 [2024-11-16 16:44:18.144411] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:32.631 INFO: Running with entropic power schedule (0xFF, 100). 00:08:32.631 INFO: Seed: 1704712987 00:08:32.631 INFO: Loaded 1 modules (344599 inline 8-bit counters): 344599 [0x2679d4c, 0x26cdf63), 00:08:32.631 INFO: Loaded 1 PC tables (344599 PCs): 344599 [0x26cdf68,0x2c100d8), 00:08:32.631 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:32.631 INFO: A corpus is not provided, starting from an empty corpus 00:08:32.631 #2 INITED exec/s: 0 rss: 59Mb 00:08:32.631 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:32.631 This may also happen if the target rejected all inputs we tried so far 00:08:32.631 [2024-11-16 16:44:18.199437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463689654534143 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.631 [2024-11-16 16:44:18.199468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.889 NEW_FUNC[1/672]: 0x47cdc8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:32.889 NEW_FUNC[2/672]: 0x48da48 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:32.889 #11 NEW cov: 11674 ft: 11673 corp: 2/23b lim: 100 exec/s: 0 rss: 67Mb L: 22/22 MS: 4 ChangeByte-InsertRepeatedBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:32.889 [2024-11-16 16:44:18.510382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.889 [2024-11-16 16:44:18.510441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.889 #13 NEW cov: 11787 ft: 12265 corp: 3/57b lim: 100 exec/s: 0 rss: 67Mb L: 34/34 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:32.889 [2024-11-16 16:44:18.550513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.889 [2024-11-16 16:44:18.550542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.889 [2024-11-16 16:44:18.550577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.889 [2024-11-16 16:44:18.550592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.889 [2024-11-16 16:44:18.550647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.889 [2024-11-16 16:44:18.550662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.889 #14 NEW cov: 11793 ft: 13257 corp: 4/120b lim: 100 exec/s: 0 rss: 67Mb L: 63/63 MS: 1 InsertRepeatedBytes- 00:08:32.889 [2024-11-16 16:44:18.590414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583023359 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.889 [2024-11-16 16:44:18.590442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.889 #15 NEW cov: 11878 ft: 13542 corp: 5/154b lim: 100 exec/s: 0 rss: 67Mb L: 34/63 MS: 1 ChangeByte- 00:08:32.889 [2024-11-16 16:44:18.630598] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3761688987579986996 len:13568 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.889 [2024-11-16 16:44:18.630625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:08:32.889 [2024-11-16 16:44:18.630676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.889 [2024-11-16 16:44:18.630692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.147 #18 NEW cov: 11878 ft: 13987 corp: 6/195b lim: 100 exec/s: 0 rss: 67Mb L: 41/63 MS: 3 InsertRepeatedBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:33.147 [2024-11-16 16:44:18.670730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3761688987579986996 len:13568 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.147 [2024-11-16 16:44:18.670757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.147 [2024-11-16 16:44:18.670795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446743163176484863 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.147 [2024-11-16 16:44:18.670811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.147 #19 NEW cov: 11878 ft: 14119 corp: 7/236b lim: 100 exec/s: 0 rss: 67Mb L: 41/63 MS: 1 ChangeByte- 00:08:33.147 [2024-11-16 16:44:18.710713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463689654534143 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.147 [2024-11-16 16:44:18.710740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.147 #20 NEW cov: 11878 ft: 14179 corp: 8/258b lim: 100 exec/s: 0 rss: 67Mb L: 22/63 MS: 1 ChangeBit- 00:08:33.147 [2024-11-16 16:44:18.750813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463689654534143 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.147 [2024-11-16 16:44:18.750840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.147 #21 NEW cov: 11878 ft: 14256 corp: 9/280b lim: 100 exec/s: 0 rss: 67Mb L: 22/63 MS: 1 ChangeByte- 00:08:33.147 [2024-11-16 16:44:18.790920] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8862803682610118655 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.147 [2024-11-16 16:44:18.790947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.147 #27 NEW cov: 11878 ft: 14284 corp: 10/302b lim: 100 exec/s: 0 rss: 67Mb L: 22/63 MS: 1 ChangeByte- 00:08:33.147 [2024-11-16 16:44:18.831318] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3255307777713450285 len:11566 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.147 [2024-11-16 16:44:18.831347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.147 [2024-11-16 16:44:18.831383] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3255307777713450285 len:11566 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.147 [2024-11-16 16:44:18.831398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.147 [2024-11-16 16:44:18.831448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3255307777713450285 len:11566 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.147 [2024-11-16 16:44:18.831464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.147 #28 NEW cov: 11878 ft: 14342 corp: 11/372b lim: 100 exec/s: 0 rss: 67Mb L: 70/70 MS: 1 InsertRepeatedBytes- 00:08:33.147 [2024-11-16 16:44:18.871139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463689654534143 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.147 [2024-11-16 16:44:18.871166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.147 #29 NEW cov: 11878 ft: 14447 corp: 12/394b lim: 100 exec/s: 0 rss: 67Mb L: 22/70 MS: 1 ShuffleBytes- 00:08:33.405 [2024-11-16 16:44:18.911282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8862803682610118655 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:18.911307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.405 #30 NEW cov: 11878 ft: 14503 corp: 13/416b lim: 100 exec/s: 0 rss: 67Mb L: 22/70 MS: 1 ChangeByte- 00:08:33.405 [2024-11-16 16:44:18.951658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069599070975 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:18.951688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.405 [2024-11-16 16:44:18.951736] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:18.951752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.405 [2024-11-16 16:44:18.951800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:18.951815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.405 #31 NEW cov: 11878 ft: 14551 corp: 14/479b lim: 100 exec/s: 0 rss: 67Mb L: 63/70 MS: 1 ShuffleBytes- 00:08:33.405 [2024-11-16 16:44:19.001544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8862803648250380287 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:19.001571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.405 #32 NEW cov: 11878 ft: 14587 corp: 15/501b lim: 100 exec/s: 0 rss: 67Mb L: 22/70 MS: 1 ChangeBit- 00:08:33.405 [2024-11-16 16:44:19.041647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1657044278817325055 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:19.041677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.405 #33 NEW cov: 11878 ft: 14596 corp: 16/523b lim: 100 exec/s: 0 rss: 67Mb L: 22/70 MS: 1 ChangeBinInt- 00:08:33.405 [2024-11-16 16:44:19.082030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:19.082060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.405 [2024-11-16 16:44:19.082096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:19.082111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.405 [2024-11-16 16:44:19.082161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:19.082176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.405 NEW_FUNC[1/1]: 0x1967b18 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:33.405 #34 NEW cov: 11901 ft: 14685 corp: 17/586b lim: 100 exec/s: 0 rss: 68Mb L: 63/70 MS: 1 ShuffleBytes- 00:08:33.405 [2024-11-16 16:44:19.121889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.405 [2024-11-16 16:44:19.121916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.405 #35 NEW cov: 11901 ft: 14728 corp: 18/621b lim: 100 exec/s: 0 rss: 68Mb L: 35/70 MS: 1 InsertByte- 00:08:33.662 [2024-11-16 16:44:19.162018] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8862803648250380287 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.162046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.662 #36 NEW cov: 11901 ft: 14739 corp: 19/643b lim: 100 exec/s: 36 rss: 68Mb L: 22/70 MS: 1 ChangeASCIIInt- 00:08:33.662 [2024-11-16 16:44:19.202413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.202439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.662 [2024-11-16 16:44:19.202475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.202489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.662 [2024-11-16 16:44:19.202540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.202554] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.662 #37 NEW cov: 11901 ft: 14748 corp: 20/706b lim: 100 exec/s: 37 rss: 68Mb L: 63/70 MS: 1 ShuffleBytes- 00:08:33.662 [2024-11-16 16:44:19.242220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463689654534143 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.242247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.662 #38 NEW cov: 11901 ft: 14754 corp: 21/728b lim: 100 exec/s: 38 rss: 68Mb L: 22/70 MS: 1 ChangeASCIIInt- 00:08:33.662 [2024-11-16 16:44:19.272749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.272776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.662 [2024-11-16 16:44:19.272823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.272844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.662 [2024-11-16 16:44:19.272893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.272909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.662 [2024-11-16 16:44:19.272956] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.272973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.662 #39 NEW cov: 11901 ft: 15143 corp: 22/809b lim: 100 exec/s: 39 rss: 68Mb L: 81/81 MS: 1 CopyPart- 00:08:33.662 [2024-11-16 16:44:19.312396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463689654534143 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.312424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.662 #40 NEW cov: 11901 ft: 15163 corp: 23/831b lim: 100 exec/s: 40 rss: 68Mb L: 22/81 MS: 1 CMP- DE: "\377\006"- 00:08:33.662 [2024-11-16 16:44:19.352566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8863079660028690431 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.352594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.662 #41 NEW cov: 11901 ft: 15170 corp: 24/853b lim: 100 exec/s: 41 rss: 68Mb L: 22/81 MS: 1 ChangeBinInt- 00:08:33.662 [2024-11-16 16:44:19.392966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3255307777713450285 len:11566 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 
16:44:19.392993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.662 [2024-11-16 16:44:19.393027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3255307777713450285 len:11566 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.393043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.662 [2024-11-16 16:44:19.393093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3255307777713450285 len:11566 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.662 [2024-11-16 16:44:19.393109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.921 #42 NEW cov: 11901 ft: 15192 corp: 25/922b lim: 100 exec/s: 42 rss: 68Mb L: 69/81 MS: 1 EraseBytes- 00:08:33.921 [2024-11-16 16:44:19.442800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463689654534143 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.921 [2024-11-16 16:44:19.442827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.921 #43 NEW cov: 11901 ft: 15219 corp: 26/945b lim: 100 exec/s: 43 rss: 68Mb L: 23/81 MS: 1 InsertByte- 00:08:33.921 [2024-11-16 16:44:19.483212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3255307777713450285 len:11566 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.921 [2024-11-16 16:44:19.483240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.921 [2024-11-16 16:44:19.483281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:3255307777713450285 len:11566 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.921 [2024-11-16 16:44:19.483299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.921 [2024-11-16 16:44:19.483352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:3255307777713450285 len:11566 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.921 [2024-11-16 16:44:19.483367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.921 #44 NEW cov: 11901 ft: 15254 corp: 27/1015b lim: 100 exec/s: 44 rss: 68Mb L: 70/81 MS: 1 ShuffleBytes- 00:08:33.921 [2024-11-16 16:44:19.523168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463689654534143 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.922 [2024-11-16 16:44:19.523195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.922 [2024-11-16 16:44:19.523229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744069431230463 len:35467 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.922 [2024-11-16 16:44:19.523244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.922 #45 NEW cov: 11901 ft: 15265 corp: 28/1056b 
lim: 100 exec/s: 45 rss: 68Mb L: 41/81 MS: 1 CopyPart- 00:08:33.922 [2024-11-16 16:44:19.563320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3761688987579986996 len:13568 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.922 [2024-11-16 16:44:19.563347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.922 [2024-11-16 16:44:19.563394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446743163176484863 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.922 [2024-11-16 16:44:19.563411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.922 #46 NEW cov: 11901 ft: 15273 corp: 29/1097b lim: 100 exec/s: 46 rss: 68Mb L: 41/81 MS: 1 ShuffleBytes- 00:08:33.922 [2024-11-16 16:44:19.603259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463414776627199 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.922 [2024-11-16 16:44:19.603285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.922 #47 NEW cov: 11901 ft: 15326 corp: 30/1119b lim: 100 exec/s: 47 rss: 68Mb L: 22/81 MS: 1 ChangeBit- 00:08:33.922 [2024-11-16 16:44:19.643848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.922 [2024-11-16 16:44:19.643875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.922 [2024-11-16 16:44:19.643923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:7740398493674204011 len:27500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.922 [2024-11-16 16:44:19.643939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.922 [2024-11-16 16:44:19.643989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.922 [2024-11-16 16:44:19.644003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.922 [2024-11-16 16:44:19.644054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.922 [2024-11-16 16:44:19.644070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.922 #48 NEW cov: 11901 ft: 15342 corp: 31/1203b lim: 100 exec/s: 48 rss: 68Mb L: 84/84 MS: 1 InsertRepeatedBytes- 00:08:34.180 [2024-11-16 16:44:19.683608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3761688987582411828 len:13568 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.180 [2024-11-16 16:44:19.683635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.180 [2024-11-16 16:44:19.683679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.180 [2024-11-16 16:44:19.683695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.180 #49 NEW cov: 11901 ft: 15358 corp: 32/1244b lim: 100 exec/s: 49 rss: 68Mb L: 41/84 MS: 1 ChangeByte- 00:08:34.180 [2024-11-16 16:44:19.723730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3761688987579986996 len:13568 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.180 [2024-11-16 16:44:19.723757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.180 [2024-11-16 16:44:19.723791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446743163176484863 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.180 [2024-11-16 16:44:19.723806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.180 #50 NEW cov: 11901 ft: 15387 corp: 33/1285b lim: 100 exec/s: 50 rss: 68Mb L: 41/84 MS: 1 ShuffleBytes- 00:08:34.180 [2024-11-16 16:44:19.763703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8862803682610118655 len:65533 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.180 [2024-11-16 16:44:19.763731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.180 #51 NEW cov: 11901 ft: 15455 corp: 34/1307b lim: 100 exec/s: 51 rss: 68Mb L: 22/84 MS: 1 ChangeByte- 00:08:34.180 [2024-11-16 16:44:19.803850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463689654534143 len:1792 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.180 [2024-11-16 16:44:19.803877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.180 #52 NEW cov: 11901 ft: 15525 corp: 35/1329b lim: 100 exec/s: 52 rss: 68Mb L: 22/84 MS: 1 PersAutoDict- DE: "\377\006"- 00:08:34.180 [2024-11-16 16:44:19.844243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:792633530290732810 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.180 [2024-11-16 16:44:19.844271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.180 [2024-11-16 16:44:19.844313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.180 [2024-11-16 16:44:19.844328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.180 [2024-11-16 16:44:19.844377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.180 [2024-11-16 16:44:19.844392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.180 #53 NEW cov: 11901 ft: 15548 corp: 36/1392b lim: 100 exec/s: 53 rss: 68Mb L: 63/84 MS: 1 CopyPart- 00:08:34.180 [2024-11-16 16:44:19.884358] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8862803648250380287 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.181 [2024-11-16 16:44:19.884385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.181 [2024-11-16 16:44:19.884425] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:15770157678700714714 len:56027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.181 [2024-11-16 16:44:19.884442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.181 [2024-11-16 16:44:19.884493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:15770157678700714714 len:56027 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.181 [2024-11-16 16:44:19.884507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.181 #54 NEW cov: 11901 ft: 15565 corp: 37/1457b lim: 100 exec/s: 54 rss: 69Mb L: 65/84 MS: 1 InsertRepeatedBytes- 00:08:34.181 [2024-11-16 16:44:19.924168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8862803648250380287 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.181 [2024-11-16 16:44:19.924195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.439 #55 NEW cov: 11901 ft: 15574 corp: 38/1480b lim: 100 exec/s: 55 rss: 69Mb L: 23/84 MS: 1 InsertByte- 00:08:34.439 [2024-11-16 16:44:19.964431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3761688987579986996 len:13568 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.439 [2024-11-16 16:44:19.964460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.439 [2024-11-16 16:44:19.964510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446743163176484863 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.439 [2024-11-16 16:44:19.964528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.439 #56 NEW cov: 11901 ft: 15594 corp: 39/1521b lim: 100 exec/s: 56 rss: 69Mb L: 41/84 MS: 1 ChangeByte- 00:08:34.439 [2024-11-16 16:44:20.004405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18387914708360364031 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.439 [2024-11-16 16:44:20.004433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.439 #57 NEW cov: 11901 ft: 15600 corp: 40/1545b lim: 100 exec/s: 57 rss: 69Mb L: 24/84 MS: 1 InsertByte- 00:08:34.439 [2024-11-16 16:44:20.044659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18387914708360364031 len:65465 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.439 [2024-11-16 16:44:20.044692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.439 [2024-11-16 16:44:20.044723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13310591802206107832 len:47289 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:34.439 [2024-11-16 16:44:20.044738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.439 #63 NEW cov: 11901 ft: 15651 corp: 41/1601b lim: 100 exec/s: 63 rss: 69Mb L: 56/84 MS: 1 InsertRepeatedBytes- 00:08:34.439 [2024-11-16 16:44:20.094800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:3761688987576576308 len:13568 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.439 [2024-11-16 16:44:20.094828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.439 [2024-11-16 16:44:20.094866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.439 [2024-11-16 16:44:20.094882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.439 #64 NEW cov: 11901 ft: 15661 corp: 42/1642b lim: 100 exec/s: 64 rss: 69Mb L: 41/84 MS: 1 ChangeBinInt- 00:08:34.439 [2024-11-16 16:44:20.144786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446463414776627199 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.439 [2024-11-16 16:44:20.144814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.439 #65 NEW cov: 11901 ft: 15670 corp: 43/1664b lim: 100 exec/s: 65 rss: 69Mb L: 22/84 MS: 1 PersAutoDict- DE: "\377\006"- 00:08:34.439 [2024-11-16 16:44:20.184882] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8863084066665136127 len:65419 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.439 [2024-11-16 16:44:20.184911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.698 #66 NEW cov: 11901 ft: 15683 corp: 44/1686b lim: 100 exec/s: 33 rss: 69Mb L: 22/84 MS: 1 CopyPart- 00:08:34.698 #66 DONE cov: 11901 ft: 15683 corp: 44/1686b lim: 100 exec/s: 33 rss: 69Mb 00:08:34.698 ###### Recommended dictionary. ###### 00:08:34.698 "\377\006" # Uses: 2 00:08:34.698 ###### End of recommended dictionary. 
###### 00:08:34.698 Done 66 runs in 2 second(s) 00:08:34.698 16:44:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:34.698 16:44:20 -- ../common.sh@72 -- # (( i++ )) 00:08:34.698 16:44:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:34.698 16:44:20 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:34.698 00:08:34.698 real 1m2.408s 00:08:34.698 user 1m39.271s 00:08:34.698 sys 0m6.913s 00:08:34.698 16:44:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:34.698 16:44:20 -- common/autotest_common.sh@10 -- # set +x 00:08:34.698 ************************************ 00:08:34.698 END TEST nvmf_fuzz 00:08:34.698 ************************************ 00:08:34.698 16:44:20 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:34.698 16:44:20 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:34.698 16:44:20 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:34.698 16:44:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:34.698 16:44:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:34.698 16:44:20 -- common/autotest_common.sh@10 -- # set +x 00:08:34.698 ************************************ 00:08:34.698 START TEST vfio_fuzz 00:08:34.698 ************************************ 00:08:34.698 16:44:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:34.960 * Looking for test storage... 00:08:34.960 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:34.960 16:44:20 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:34.960 16:44:20 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:34.960 16:44:20 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:34.960 16:44:20 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:34.960 16:44:20 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:34.960 16:44:20 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:34.960 16:44:20 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:34.960 16:44:20 -- scripts/common.sh@335 -- # IFS=.-: 00:08:34.960 16:44:20 -- scripts/common.sh@335 -- # read -ra ver1 00:08:34.960 16:44:20 -- scripts/common.sh@336 -- # IFS=.-: 00:08:34.960 16:44:20 -- scripts/common.sh@336 -- # read -ra ver2 00:08:34.960 16:44:20 -- scripts/common.sh@337 -- # local 'op=<' 00:08:34.960 16:44:20 -- scripts/common.sh@339 -- # ver1_l=2 00:08:34.960 16:44:20 -- scripts/common.sh@340 -- # ver2_l=1 00:08:34.960 16:44:20 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:34.960 16:44:20 -- scripts/common.sh@343 -- # case "$op" in 00:08:34.960 16:44:20 -- scripts/common.sh@344 -- # : 1 00:08:34.960 16:44:20 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:34.960 16:44:20 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:34.960 16:44:20 -- scripts/common.sh@364 -- # decimal 1 00:08:34.960 16:44:20 -- scripts/common.sh@352 -- # local d=1 00:08:34.960 16:44:20 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:34.960 16:44:20 -- scripts/common.sh@354 -- # echo 1 00:08:34.960 16:44:20 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:34.960 16:44:20 -- scripts/common.sh@365 -- # decimal 2 00:08:34.960 16:44:20 -- scripts/common.sh@352 -- # local d=2 00:08:34.960 16:44:20 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:34.960 16:44:20 -- scripts/common.sh@354 -- # echo 2 00:08:34.960 16:44:20 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:34.960 16:44:20 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:34.960 16:44:20 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:34.960 16:44:20 -- scripts/common.sh@367 -- # return 0 00:08:34.960 16:44:20 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:34.960 16:44:20 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:34.960 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.960 --rc genhtml_branch_coverage=1 00:08:34.960 --rc genhtml_function_coverage=1 00:08:34.960 --rc genhtml_legend=1 00:08:34.960 --rc geninfo_all_blocks=1 00:08:34.960 --rc geninfo_unexecuted_blocks=1 00:08:34.960 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.960 ' 00:08:34.960 16:44:20 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:34.960 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.960 --rc genhtml_branch_coverage=1 00:08:34.960 --rc genhtml_function_coverage=1 00:08:34.960 --rc genhtml_legend=1 00:08:34.960 --rc geninfo_all_blocks=1 00:08:34.960 --rc geninfo_unexecuted_blocks=1 00:08:34.960 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.960 ' 00:08:34.960 16:44:20 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:34.960 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.960 --rc genhtml_branch_coverage=1 00:08:34.960 --rc genhtml_function_coverage=1 00:08:34.960 --rc genhtml_legend=1 00:08:34.960 --rc geninfo_all_blocks=1 00:08:34.960 --rc geninfo_unexecuted_blocks=1 00:08:34.960 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.960 ' 00:08:34.960 16:44:20 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:34.960 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.960 --rc genhtml_branch_coverage=1 00:08:34.960 --rc genhtml_function_coverage=1 00:08:34.960 --rc genhtml_legend=1 00:08:34.960 --rc geninfo_all_blocks=1 00:08:34.960 --rc geninfo_unexecuted_blocks=1 00:08:34.960 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:34.960 ' 00:08:34.960 16:44:20 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:34.960 16:44:20 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:34.961 16:44:20 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:34.961 16:44:20 -- common/autotest_common.sh@34 -- # set -e 00:08:34.961 16:44:20 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:34.961 16:44:20 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:34.961 16:44:20 -- common/autotest_common.sh@38 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:34.961 16:44:20 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:34.961 16:44:20 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:34.961 16:44:20 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:34.961 16:44:20 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:34.961 16:44:20 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:34.961 16:44:20 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:34.961 16:44:20 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:34.961 16:44:20 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:34.961 16:44:20 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:34.961 16:44:20 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:34.961 16:44:20 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:34.961 16:44:20 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:34.961 16:44:20 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:34.961 16:44:20 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:34.961 16:44:20 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:34.961 16:44:20 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:34.961 16:44:20 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:34.961 16:44:20 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:34.961 16:44:20 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:34.961 16:44:20 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:34.961 16:44:20 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:34.961 16:44:20 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:34.961 16:44:20 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:34.961 16:44:20 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:34.961 16:44:20 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:34.961 16:44:20 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:34.961 16:44:20 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:34.961 16:44:20 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:34.961 16:44:20 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:34.961 16:44:20 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:34.961 16:44:20 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:34.961 16:44:20 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:34.961 16:44:20 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:34.961 16:44:20 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:34.961 16:44:20 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:34.961 16:44:20 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:34.961 16:44:20 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:34.961 16:44:20 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:34.961 16:44:20 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:34.961 16:44:20 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:34.961 16:44:20 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:34.961 16:44:20 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:34.961 
16:44:20 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:34.961 16:44:20 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:34.961 16:44:20 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:34.961 16:44:20 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:34.961 16:44:20 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:34.961 16:44:20 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:34.961 16:44:20 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:34.961 16:44:20 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:34.961 16:44:20 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:34.961 16:44:20 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:34.961 16:44:20 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:34.961 16:44:20 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:34.961 16:44:20 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:34.961 16:44:20 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:34.961 16:44:20 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:34.961 16:44:20 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:34.961 16:44:20 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:34.961 16:44:20 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:34.961 16:44:20 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:34.961 16:44:20 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:34.961 16:44:20 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:34.961 16:44:20 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:34.961 16:44:20 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:34.961 16:44:20 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:34.961 16:44:20 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:34.961 16:44:20 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:34.961 16:44:20 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:34.961 16:44:20 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:34.961 16:44:20 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:34.961 16:44:20 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:34.961 16:44:20 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:34.961 16:44:20 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:34.961 16:44:20 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:34.961 16:44:20 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:34.961 16:44:20 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:34.961 16:44:20 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:34.961 16:44:20 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:34.961 16:44:20 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:34.961 16:44:20 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:34.961 16:44:20 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:34.961 16:44:20 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:34.961 16:44:20 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:34.961 16:44:20 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 
00:08:34.961 16:44:20 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:34.961 16:44:20 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:34.961 16:44:20 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:34.961 16:44:20 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:34.961 16:44:20 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:34.961 16:44:20 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:34.961 16:44:20 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:34.961 16:44:20 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:34.961 16:44:20 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:34.961 16:44:20 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:34.961 16:44:20 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:34.961 #define SPDK_CONFIG_H 00:08:34.961 #define SPDK_CONFIG_APPS 1 00:08:34.961 #define SPDK_CONFIG_ARCH native 00:08:34.961 #undef SPDK_CONFIG_ASAN 00:08:34.961 #undef SPDK_CONFIG_AVAHI 00:08:34.961 #undef SPDK_CONFIG_CET 00:08:34.961 #define SPDK_CONFIG_COVERAGE 1 00:08:34.961 #define SPDK_CONFIG_CROSS_PREFIX 00:08:34.961 #undef SPDK_CONFIG_CRYPTO 00:08:34.961 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:34.961 #undef SPDK_CONFIG_CUSTOMOCF 00:08:34.961 #undef SPDK_CONFIG_DAOS 00:08:34.961 #define SPDK_CONFIG_DAOS_DIR 00:08:34.961 #define SPDK_CONFIG_DEBUG 1 00:08:34.961 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:34.961 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:34.961 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:34.961 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:34.961 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:34.961 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:34.961 #define SPDK_CONFIG_EXAMPLES 1 00:08:34.961 #undef SPDK_CONFIG_FC 00:08:34.961 #define SPDK_CONFIG_FC_PATH 00:08:34.961 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:34.961 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:34.961 #undef SPDK_CONFIG_FUSE 00:08:34.961 #define SPDK_CONFIG_FUZZER 1 00:08:34.961 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:34.961 #undef SPDK_CONFIG_GOLANG 00:08:34.961 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:34.961 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:34.961 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:34.961 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:34.961 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:34.961 #define SPDK_CONFIG_IDXD 1 00:08:34.961 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:34.961 #undef SPDK_CONFIG_IPSEC_MB 00:08:34.961 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:34.961 #define SPDK_CONFIG_ISAL 1 00:08:34.961 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:34.961 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:34.961 #define SPDK_CONFIG_LIBDIR 00:08:34.961 #undef SPDK_CONFIG_LTO 00:08:34.961 #define SPDK_CONFIG_MAX_LCORES 00:08:34.961 #define SPDK_CONFIG_NVME_CUSE 1 00:08:34.961 #undef SPDK_CONFIG_OCF 00:08:34.961 #define SPDK_CONFIG_OCF_PATH 00:08:34.961 #define 
SPDK_CONFIG_OPENSSL_PATH 00:08:34.961 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:34.961 #undef SPDK_CONFIG_PGO_USE 00:08:34.961 #define SPDK_CONFIG_PREFIX /usr/local 00:08:34.961 #undef SPDK_CONFIG_RAID5F 00:08:34.961 #undef SPDK_CONFIG_RBD 00:08:34.961 #define SPDK_CONFIG_RDMA 1 00:08:34.961 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:34.961 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:34.961 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:34.962 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:34.962 #undef SPDK_CONFIG_SHARED 00:08:34.962 #undef SPDK_CONFIG_SMA 00:08:34.962 #define SPDK_CONFIG_TESTS 1 00:08:34.962 #undef SPDK_CONFIG_TSAN 00:08:34.962 #define SPDK_CONFIG_UBLK 1 00:08:34.962 #define SPDK_CONFIG_UBSAN 1 00:08:34.962 #undef SPDK_CONFIG_UNIT_TESTS 00:08:34.962 #undef SPDK_CONFIG_URING 00:08:34.962 #define SPDK_CONFIG_URING_PATH 00:08:34.962 #undef SPDK_CONFIG_URING_ZNS 00:08:34.962 #undef SPDK_CONFIG_USDT 00:08:34.962 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:34.962 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:34.962 #define SPDK_CONFIG_VFIO_USER 1 00:08:34.962 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:34.962 #define SPDK_CONFIG_VHOST 1 00:08:34.962 #define SPDK_CONFIG_VIRTIO 1 00:08:34.962 #undef SPDK_CONFIG_VTUNE 00:08:34.962 #define SPDK_CONFIG_VTUNE_DIR 00:08:34.962 #define SPDK_CONFIG_WERROR 1 00:08:34.962 #define SPDK_CONFIG_WPDK_DIR 00:08:34.962 #undef SPDK_CONFIG_XNVME 00:08:34.962 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:34.962 16:44:20 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:34.962 16:44:20 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:34.962 16:44:20 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:34.962 16:44:20 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:34.962 16:44:20 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:34.962 16:44:20 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:34.962 16:44:20 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:34.962 16:44:20 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:34.962 16:44:20 -- paths/export.sh@5 
-- # export PATH 00:08:34.962 16:44:20 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:34.962 16:44:20 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:34.962 16:44:20 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:34.962 16:44:20 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:34.962 16:44:20 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:34.962 16:44:20 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:34.962 16:44:20 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:34.962 16:44:20 -- pm/common@16 -- # TEST_TAG=N/A 00:08:34.962 16:44:20 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:34.962 16:44:20 -- common/autotest_common.sh@52 -- # : 1 00:08:34.962 16:44:20 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:34.962 16:44:20 -- common/autotest_common.sh@56 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:34.962 16:44:20 -- common/autotest_common.sh@58 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:34.962 16:44:20 -- common/autotest_common.sh@60 -- # : 1 00:08:34.962 16:44:20 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:34.962 16:44:20 -- common/autotest_common.sh@62 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:34.962 16:44:20 -- common/autotest_common.sh@64 -- # : 00:08:34.962 16:44:20 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:34.962 16:44:20 -- common/autotest_common.sh@66 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:34.962 16:44:20 -- common/autotest_common.sh@68 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:34.962 16:44:20 -- common/autotest_common.sh@70 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:34.962 16:44:20 -- common/autotest_common.sh@72 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:34.962 16:44:20 -- common/autotest_common.sh@74 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:34.962 16:44:20 -- common/autotest_common.sh@76 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:34.962 16:44:20 -- common/autotest_common.sh@78 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:34.962 16:44:20 -- common/autotest_common.sh@80 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:34.962 
16:44:20 -- common/autotest_common.sh@82 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:34.962 16:44:20 -- common/autotest_common.sh@84 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:34.962 16:44:20 -- common/autotest_common.sh@86 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:34.962 16:44:20 -- common/autotest_common.sh@88 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:34.962 16:44:20 -- common/autotest_common.sh@90 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:34.962 16:44:20 -- common/autotest_common.sh@92 -- # : 1 00:08:34.962 16:44:20 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:34.962 16:44:20 -- common/autotest_common.sh@94 -- # : 1 00:08:34.962 16:44:20 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:34.962 16:44:20 -- common/autotest_common.sh@96 -- # : rdma 00:08:34.962 16:44:20 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:34.962 16:44:20 -- common/autotest_common.sh@98 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:34.962 16:44:20 -- common/autotest_common.sh@100 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:34.962 16:44:20 -- common/autotest_common.sh@102 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:34.962 16:44:20 -- common/autotest_common.sh@104 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:34.962 16:44:20 -- common/autotest_common.sh@106 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:34.962 16:44:20 -- common/autotest_common.sh@108 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:34.962 16:44:20 -- common/autotest_common.sh@110 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:34.962 16:44:20 -- common/autotest_common.sh@112 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:34.962 16:44:20 -- common/autotest_common.sh@114 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:34.962 16:44:20 -- common/autotest_common.sh@116 -- # : 1 00:08:34.962 16:44:20 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:34.962 16:44:20 -- common/autotest_common.sh@118 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:34.962 16:44:20 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:34.962 16:44:20 -- common/autotest_common.sh@120 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:34.962 16:44:20 -- common/autotest_common.sh@122 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:34.962 16:44:20 -- common/autotest_common.sh@124 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:34.962 16:44:20 -- common/autotest_common.sh@126 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:34.962 16:44:20 -- common/autotest_common.sh@128 -- # : 0 00:08:34.962 16:44:20 -- 
common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:34.962 16:44:20 -- common/autotest_common.sh@130 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:34.962 16:44:20 -- common/autotest_common.sh@132 -- # : v22.11.4 00:08:34.962 16:44:20 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:34.962 16:44:20 -- common/autotest_common.sh@134 -- # : true 00:08:34.962 16:44:20 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:34.962 16:44:20 -- common/autotest_common.sh@136 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:34.962 16:44:20 -- common/autotest_common.sh@138 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:34.962 16:44:20 -- common/autotest_common.sh@140 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:34.962 16:44:20 -- common/autotest_common.sh@142 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:34.962 16:44:20 -- common/autotest_common.sh@144 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:34.962 16:44:20 -- common/autotest_common.sh@146 -- # : 0 00:08:34.962 16:44:20 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:34.962 16:44:20 -- common/autotest_common.sh@148 -- # : 00:08:34.962 16:44:20 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:34.962 16:44:20 -- common/autotest_common.sh@150 -- # : 0 00:08:34.963 16:44:20 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:34.963 16:44:20 -- common/autotest_common.sh@152 -- # : 0 00:08:34.963 16:44:20 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:34.963 16:44:20 -- common/autotest_common.sh@154 -- # : 0 00:08:34.963 16:44:20 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:34.963 16:44:20 -- common/autotest_common.sh@156 -- # : 0 00:08:34.963 16:44:20 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:34.963 16:44:20 -- common/autotest_common.sh@158 -- # : 0 00:08:34.963 16:44:20 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:34.963 16:44:20 -- common/autotest_common.sh@160 -- # : 0 00:08:34.963 16:44:20 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:34.963 16:44:20 -- common/autotest_common.sh@163 -- # : 00:08:34.963 16:44:20 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:34.963 16:44:20 -- common/autotest_common.sh@165 -- # : 0 00:08:34.963 16:44:20 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:34.963 16:44:20 -- common/autotest_common.sh@167 -- # : 0 00:08:34.963 16:44:20 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:34.963 16:44:20 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:34.963 16:44:20 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:34.963 16:44:20 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:34.963 16:44:20 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:34.963 16:44:20 -- common/autotest_common.sh@173 -- # export 
VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:34.963 16:44:20 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:34.963 16:44:20 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:34.963 16:44:20 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:34.963 16:44:20 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:34.963 16:44:20 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:34.963 16:44:20 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:34.963 16:44:20 -- common/autotest_common.sh@181 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:34.963 16:44:20 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:34.963 16:44:20 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:34.963 16:44:20 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:34.963 16:44:20 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:34.963 16:44:20 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:34.963 16:44:20 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:34.963 16:44:20 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:34.963 16:44:20 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:34.963 16:44:20 -- common/autotest_common.sh@196 -- # cat 00:08:34.963 16:44:20 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:34.963 16:44:20 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:34.963 16:44:20 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:34.963 16:44:20 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:34.963 16:44:20 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:34.963 16:44:20 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:34.963 16:44:20 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:34.963 16:44:20 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:34.963 16:44:20 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:34.963 16:44:20 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:34.963 16:44:20 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:34.963 16:44:20 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:34.963 16:44:20 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:34.963 16:44:20 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:34.963 16:44:20 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:34.963 16:44:20 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:34.963 16:44:20 -- common/autotest_common.sh@242 -- # 
AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:34.963 16:44:20 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:34.963 16:44:20 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:34.963 16:44:20 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:08:34.963 16:44:20 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:08:34.963 16:44:20 -- common/autotest_common.sh@249 -- # _LCOV= 00:08:34.963 16:44:20 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:08:34.963 16:44:20 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:08:34.963 16:44:20 -- common/autotest_common.sh@250 -- # _LCOV=1 00:08:34.963 16:44:20 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:34.963 16:44:20 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:08:34.963 16:44:20 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:34.963 16:44:20 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:08:34.963 16:44:20 -- common/autotest_common.sh@259 -- # export valgrind= 00:08:34.963 16:44:20 -- common/autotest_common.sh@259 -- # valgrind= 00:08:34.963 16:44:20 -- common/autotest_common.sh@265 -- # uname -s 00:08:34.963 16:44:20 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:08:34.963 16:44:20 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:08:34.963 16:44:20 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:08:34.963 16:44:20 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:08:34.963 16:44:20 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:34.963 16:44:20 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:34.963 16:44:20 -- common/autotest_common.sh@275 -- # MAKE=make 00:08:34.963 16:44:20 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:08:34.963 16:44:20 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:08:34.963 16:44:20 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:08:34.963 16:44:20 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:34.963 16:44:20 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:08:34.963 16:44:20 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:08:34.963 16:44:20 -- common/autotest_common.sh@319 -- # [[ -z 494325 ]] 00:08:34.963 16:44:20 -- common/autotest_common.sh@319 -- # kill -0 494325 00:08:34.963 16:44:20 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:08:34.963 16:44:20 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:08:34.963 16:44:20 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:08:34.963 16:44:20 -- common/autotest_common.sh@332 -- # local mount target_dir 00:08:34.963 16:44:20 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:08:34.963 16:44:20 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:08:34.963 16:44:20 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:08:34.963 16:44:20 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:08:34.963 16:44:20 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.RvNnBP 00:08:34.963 16:44:20 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" 
"$storage_fallback") 00:08:34.963 16:44:20 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:08:34.963 16:44:20 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:08:34.963 16:44:20 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.RvNnBP/tests/vfio /tmp/spdk.RvNnBP 00:08:34.963 16:44:20 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:08:34.963 16:44:20 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:34.963 16:44:20 -- common/autotest_common.sh@328 -- # df -T 00:08:34.964 16:44:20 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:08:34.964 16:44:20 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:08:34.964 16:44:20 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:08:34.964 16:44:20 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:08:34.964 16:44:20 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # avails["$mount"]=53097160704 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730578432 00:08:34.964 16:44:20 -- common/autotest_common.sh@364 -- # uses["$mount"]=8633417728 00:08:34.964 16:44:20 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864031744 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:08:34.964 16:44:20 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:08:34.964 16:44:20 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340117504 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144 00:08:34.964 16:44:20 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:08:34.964 16:44:20 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # 
avails["$mount"]=30864969728 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:08:34.964 16:44:20 -- common/autotest_common.sh@364 -- # uses["$mount"]=319488 00:08:34.964 16:44:20 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:08:34.964 16:44:20 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:08:34.964 16:44:20 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:08:34.964 16:44:20 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:34.964 16:44:20 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:08:34.964 * Looking for test storage... 00:08:34.964 16:44:20 -- common/autotest_common.sh@369 -- # local target_space new_size 00:08:34.964 16:44:20 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:08:34.964 16:44:20 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:34.964 16:44:20 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:34.964 16:44:20 -- common/autotest_common.sh@373 -- # mount=/ 00:08:34.964 16:44:20 -- common/autotest_common.sh@375 -- # target_space=53097160704 00:08:34.964 16:44:20 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:08:34.964 16:44:20 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:08:34.964 16:44:20 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:08:34.964 16:44:20 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:08:34.964 16:44:20 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:08:34.964 16:44:20 -- common/autotest_common.sh@382 -- # new_size=10848010240 00:08:34.964 16:44:20 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:34.964 16:44:20 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:34.964 16:44:20 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:34.964 16:44:20 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:34.964 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:34.964 16:44:20 -- common/autotest_common.sh@390 -- # return 0 00:08:34.964 16:44:20 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:08:34.964 16:44:20 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:08:34.964 16:44:20 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:34.964 16:44:20 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:34.964 16:44:20 -- common/autotest_common.sh@1682 -- # true 00:08:34.964 16:44:20 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:08:34.964 16:44:20 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:34.964 16:44:20 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:34.964 16:44:20 -- 
common/autotest_common.sh@27 -- # exec 00:08:34.964 16:44:20 -- common/autotest_common.sh@29 -- # exec 00:08:34.964 16:44:20 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:34.964 16:44:20 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:34.964 16:44:20 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:34.964 16:44:20 -- common/autotest_common.sh@18 -- # set -x 00:08:34.964 16:44:20 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:35.224 16:44:20 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:35.224 16:44:20 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:35.224 16:44:20 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:35.224 16:44:20 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:35.225 16:44:20 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:35.225 16:44:20 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:35.225 16:44:20 -- scripts/common.sh@335 -- # IFS=.-: 00:08:35.225 16:44:20 -- scripts/common.sh@335 -- # read -ra ver1 00:08:35.225 16:44:20 -- scripts/common.sh@336 -- # IFS=.-: 00:08:35.225 16:44:20 -- scripts/common.sh@336 -- # read -ra ver2 00:08:35.225 16:44:20 -- scripts/common.sh@337 -- # local 'op=<' 00:08:35.225 16:44:20 -- scripts/common.sh@339 -- # ver1_l=2 00:08:35.225 16:44:20 -- scripts/common.sh@340 -- # ver2_l=1 00:08:35.225 16:44:20 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:35.225 16:44:20 -- scripts/common.sh@343 -- # case "$op" in 00:08:35.225 16:44:20 -- scripts/common.sh@344 -- # : 1 00:08:35.225 16:44:20 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:35.225 16:44:20 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:35.225 16:44:20 -- scripts/common.sh@364 -- # decimal 1 00:08:35.225 16:44:20 -- scripts/common.sh@352 -- # local d=1 00:08:35.225 16:44:20 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:35.225 16:44:20 -- scripts/common.sh@354 -- # echo 1 00:08:35.225 16:44:20 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:35.225 16:44:20 -- scripts/common.sh@365 -- # decimal 2 00:08:35.225 16:44:20 -- scripts/common.sh@352 -- # local d=2 00:08:35.225 16:44:20 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:35.225 16:44:20 -- scripts/common.sh@354 -- # echo 2 00:08:35.225 16:44:20 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:35.225 16:44:20 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:35.225 16:44:20 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:35.225 16:44:20 -- scripts/common.sh@367 -- # return 0 00:08:35.225 16:44:20 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:35.225 16:44:20 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:35.225 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.225 --rc genhtml_branch_coverage=1 00:08:35.225 --rc genhtml_function_coverage=1 00:08:35.225 --rc genhtml_legend=1 00:08:35.225 --rc geninfo_all_blocks=1 00:08:35.225 --rc geninfo_unexecuted_blocks=1 00:08:35.225 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.225 ' 00:08:35.225 16:44:20 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:35.225 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.225 --rc genhtml_branch_coverage=1 00:08:35.225 --rc genhtml_function_coverage=1 00:08:35.225 --rc genhtml_legend=1 00:08:35.225 --rc geninfo_all_blocks=1 00:08:35.225 --rc geninfo_unexecuted_blocks=1 
00:08:35.225 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.225 ' 00:08:35.225 16:44:20 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:35.225 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.225 --rc genhtml_branch_coverage=1 00:08:35.225 --rc genhtml_function_coverage=1 00:08:35.225 --rc genhtml_legend=1 00:08:35.225 --rc geninfo_all_blocks=1 00:08:35.225 --rc geninfo_unexecuted_blocks=1 00:08:35.225 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.225 ' 00:08:35.225 16:44:20 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:35.225 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:35.225 --rc genhtml_branch_coverage=1 00:08:35.225 --rc genhtml_function_coverage=1 00:08:35.225 --rc genhtml_legend=1 00:08:35.225 --rc geninfo_all_blocks=1 00:08:35.225 --rc geninfo_unexecuted_blocks=1 00:08:35.225 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:35.225 ' 00:08:35.225 16:44:20 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:35.225 16:44:20 -- ../common.sh@8 -- # pids=() 00:08:35.225 16:44:20 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:35.225 16:44:20 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:35.225 16:44:20 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:35.225 16:44:20 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:35.225 16:44:20 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:35.225 16:44:20 -- vfio/run.sh@65 -- # mem_size=0 00:08:35.225 16:44:20 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:35.225 16:44:20 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:35.225 16:44:20 -- ../common.sh@69 -- # local fuzz_num=7 00:08:35.225 16:44:20 -- ../common.sh@70 -- # local time=1 00:08:35.225 16:44:20 -- ../common.sh@72 -- # (( i = 0 )) 00:08:35.225 16:44:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.225 16:44:20 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:35.225 16:44:20 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:35.225 16:44:20 -- vfio/run.sh@23 -- # local timen=1 00:08:35.225 16:44:20 -- vfio/run.sh@24 -- # local core=0x1 00:08:35.225 16:44:20 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:35.225 16:44:20 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:35.225 16:44:20 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:35.225 16:44:20 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:35.225 16:44:20 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:35.225 16:44:20 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:35.225 16:44:20 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:35.225 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:35.225 16:44:20 -- vfio/run.sh@38 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:35.225 [2024-11-16 16:44:20.848052] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:35.225 [2024-11-16 16:44:20.848148] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid494378 ] 00:08:35.225 EAL: No free 2048 kB hugepages reported on node 1 00:08:35.225 [2024-11-16 16:44:20.922436] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.225 [2024-11-16 16:44:20.963179] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:35.225 [2024-11-16 16:44:20.963324] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.485 INFO: Running with entropic power schedule (0xFF, 100). 00:08:35.485 INFO: Seed: 390743607 00:08:35.485 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:35.485 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:35.485 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:35.485 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.485 #2 INITED exec/s: 0 rss: 60Mb 00:08:35.485 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:35.485 This may also happen if the target rejected all inputs we tried so far 00:08:36.004 NEW_FUNC[1/625]: 0x450dd8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:36.004 NEW_FUNC[2/625]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:36.004 #3 NEW cov: 10661 ft: 10737 corp: 2/39b lim: 60 exec/s: 0 rss: 65Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:36.263 NEW_FUNC[1/6]: 0x13292c8 in nvmf_vfio_user_req_complete /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:5306 00:08:36.263 NEW_FUNC[2/6]: 0x1629fd8 in nvme_pcie_ctrlr /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_pcie_internal.h:210 00:08:36.263 #4 NEW cov: 10776 ft: 13593 corp: 3/77b lim: 60 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 CrossOver- 00:08:36.522 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:36.522 #8 NEW cov: 10793 ft: 15434 corp: 4/116b lim: 60 exec/s: 0 rss: 68Mb L: 39/39 MS: 4 ChangeByte-ShuffleBytes-ChangeBit-CrossOver- 00:08:36.522 #9 NEW cov: 10793 ft: 16389 corp: 5/154b lim: 60 exec/s: 9 rss: 68Mb L: 38/39 MS: 1 CMP- DE: "\000\000\000\000\004e\271\364"- 00:08:36.781 #10 NEW cov: 10793 ft: 16688 corp: 6/193b lim: 60 exec/s: 10 rss: 68Mb L: 39/39 MS: 1 CopyPart- 00:08:37.048 #11 NEW cov: 10796 ft: 16883 corp: 7/232b lim: 60 exec/s: 11 rss: 68Mb L: 39/39 MS: 1 ShuffleBytes- 00:08:37.049 #12 NEW cov: 10796 ft: 16998 corp: 8/284b lim: 60 exec/s: 12 rss: 68Mb L: 52/52 MS: 1 InsertRepeatedBytes- 00:08:37.319 #16 NEW cov: 10796 ft: 17234 corp: 9/336b lim: 60 exec/s: 16 rss: 68Mb L: 52/52 MS: 4 ChangeByte-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:37.585 #17 NEW cov: 10803 ft: 17504 corp: 10/388b lim: 60 exec/s: 17 rss: 68Mb L: 52/52 MS: 1 ChangeByte- 00:08:37.585 #20 NEW cov: 10803 ft: 17596 corp: 11/406b lim: 60 exec/s: 10 rss: 68Mb L: 18/52 MS: 3 CopyPart-ChangeBit-CrossOver- 00:08:37.585 #20 DONE cov: 10803 ft: 17596 corp: 11/406b lim: 60 exec/s: 10 rss: 68Mb 00:08:37.585 ###### Recommended dictionary. ###### 00:08:37.585 "\000\000\000\000\004e\271\364" # Uses: 0 00:08:37.585 ###### End of recommended dictionary. 
###### 00:08:37.585 Done 20 runs in 2 second(s) 00:08:37.858 16:44:23 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:37.858 16:44:23 -- ../common.sh@72 -- # (( i++ )) 00:08:37.858 16:44:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:37.859 16:44:23 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:37.859 16:44:23 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:37.859 16:44:23 -- vfio/run.sh@23 -- # local timen=1 00:08:37.859 16:44:23 -- vfio/run.sh@24 -- # local core=0x1 00:08:37.859 16:44:23 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:37.859 16:44:23 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:37.859 16:44:23 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:37.859 16:44:23 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:37.859 16:44:23 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:37.859 16:44:23 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:37.859 16:44:23 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:37.859 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:37.859 16:44:23 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:37.859 [2024-11-16 16:44:23.594306] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:37.859 [2024-11-16 16:44:23.594403] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid494922 ] 00:08:38.120 EAL: No free 2048 kB hugepages reported on node 1 00:08:38.120 [2024-11-16 16:44:23.665460] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.120 [2024-11-16 16:44:23.700694] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:38.121 [2024-11-16 16:44:23.700856] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.386 INFO: Running with entropic power schedule (0xFF, 100). 00:08:38.386 INFO: Seed: 3128744512 00:08:38.386 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:38.386 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:38.386 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:38.386 INFO: A corpus is not provided, starting from an empty corpus 00:08:38.386 #2 INITED exec/s: 0 rss: 60Mb 00:08:38.386 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:38.386 This may also happen if the target rejected all inputs we tried so far 00:08:38.386 [2024-11-16 16:44:23.992744] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:38.386 [2024-11-16 16:44:23.992776] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:38.386 [2024-11-16 16:44:23.992795] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:38.689 NEW_FUNC[1/638]: 0x451378 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:38.689 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:38.689 #3 NEW cov: 10775 ft: 10769 corp: 2/9b lim: 40 exec/s: 0 rss: 66Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:38.958 [2024-11-16 16:44:24.438932] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:38.958 [2024-11-16 16:44:24.438967] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:38.958 [2024-11-16 16:44:24.438986] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:38.958 #4 NEW cov: 10789 ft: 14499 corp: 3/17b lim: 40 exec/s: 0 rss: 67Mb L: 8/8 MS: 1 ChangeBit- 00:08:38.958 [2024-11-16 16:44:24.614027] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:38.958 [2024-11-16 16:44:24.614062] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:38.958 [2024-11-16 16:44:24.614082] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.216 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:39.216 #6 NEW cov: 10809 ft: 15435 corp: 4/27b lim: 40 exec/s: 0 rss: 68Mb L: 10/10 MS: 2 CopyPart-CrossOver- 00:08:39.216 [2024-11-16 16:44:24.800213] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.216 [2024-11-16 16:44:24.800236] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.216 [2024-11-16 16:44:24.800256] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.216 #7 NEW cov: 10809 ft: 16319 corp: 5/35b lim: 40 exec/s: 7 rss: 68Mb L: 8/10 MS: 1 ShuffleBytes- 00:08:39.473 [2024-11-16 16:44:24.974296] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.473 [2024-11-16 16:44:24.974318] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.473 [2024-11-16 16:44:24.974336] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.473 #8 NEW cov: 10809 ft: 16450 corp: 6/44b lim: 40 exec/s: 8 rss: 68Mb L: 9/10 MS: 1 CrossOver- 00:08:39.473 [2024-11-16 16:44:25.158847] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.473 [2024-11-16 16:44:25.158870] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.473 [2024-11-16 16:44:25.158890] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.732 #9 NEW cov: 10809 ft: 16557 corp: 7/52b lim: 40 exec/s: 9 rss: 68Mb L: 8/10 MS: 1 ChangeBinInt- 00:08:39.732 [2024-11-16 16:44:25.333429] vfio_user.c:3096:vfio_user_log: 
*ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.732 [2024-11-16 16:44:25.333450] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.732 [2024-11-16 16:44:25.333468] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.732 #10 NEW cov: 10809 ft: 16765 corp: 8/59b lim: 40 exec/s: 10 rss: 68Mb L: 7/10 MS: 1 EraseBytes- 00:08:39.991 [2024-11-16 16:44:25.506877] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.991 [2024-11-16 16:44:25.506899] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.991 [2024-11-16 16:44:25.506918] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:39.991 #11 NEW cov: 10809 ft: 17117 corp: 9/67b lim: 40 exec/s: 11 rss: 68Mb L: 8/10 MS: 1 ChangeByte- 00:08:39.991 [2024-11-16 16:44:25.679108] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.991 [2024-11-16 16:44:25.679129] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.991 [2024-11-16 16:44:25.679148] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.250 #12 NEW cov: 10816 ft: 17296 corp: 10/75b lim: 40 exec/s: 12 rss: 68Mb L: 8/10 MS: 1 CrossOver- 00:08:40.250 [2024-11-16 16:44:25.854650] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.250 [2024-11-16 16:44:25.854676] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.250 [2024-11-16 16:44:25.854695] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.250 #13 NEW cov: 10816 ft: 17448 corp: 11/82b lim: 40 exec/s: 6 rss: 68Mb L: 7/10 MS: 1 CopyPart- 00:08:40.250 #13 DONE cov: 10816 ft: 17448 corp: 11/82b lim: 40 exec/s: 6 rss: 68Mb 00:08:40.250 Done 13 runs in 2 second(s) 00:08:40.510 16:44:26 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:40.510 16:44:26 -- ../common.sh@72 -- # (( i++ )) 00:08:40.510 16:44:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:40.510 16:44:26 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:40.510 16:44:26 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:40.510 16:44:26 -- vfio/run.sh@23 -- # local timen=1 00:08:40.510 16:44:26 -- vfio/run.sh@24 -- # local core=0x1 00:08:40.510 16:44:26 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:40.510 16:44:26 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:40.510 16:44:26 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:40.510 16:44:26 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:40.510 16:44:26 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:40.510 16:44:26 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:40.510 16:44:26 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:40.510 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:40.510 16:44:26 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 
0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:40.769 [2024-11-16 16:44:26.260802] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:40.769 [2024-11-16 16:44:26.260876] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid495360 ] 00:08:40.769 EAL: No free 2048 kB hugepages reported on node 1 00:08:40.769 [2024-11-16 16:44:26.332560] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.769 [2024-11-16 16:44:26.367907] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:40.769 [2024-11-16 16:44:26.368070] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.029 INFO: Running with entropic power schedule (0xFF, 100). 00:08:41.029 INFO: Seed: 1496771416 00:08:41.029 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:41.029 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:41.029 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.029 INFO: A corpus is not provided, starting from an empty corpus 00:08:41.029 #2 INITED exec/s: 0 rss: 60Mb 00:08:41.029 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:41.029 This may also happen if the target rejected all inputs we tried so far 00:08:41.029 [2024-11-16 16:44:26.641830] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:41.288 NEW_FUNC[1/636]: 0x451d68 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:41.288 NEW_FUNC[2/636]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:41.288 #7 NEW cov: 10755 ft: 10575 corp: 2/39b lim: 80 exec/s: 0 rss: 65Mb L: 38/38 MS: 5 CopyPart-CrossOver-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:41.547 [2024-11-16 16:44:27.094680] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:41.547 #8 NEW cov: 10772 ft: 13494 corp: 3/77b lim: 80 exec/s: 0 rss: 67Mb L: 38/38 MS: 1 ChangeBinInt- 00:08:41.547 [2024-11-16 16:44:27.278856] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:41.806 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:41.806 #19 NEW cov: 10789 ft: 14550 corp: 4/115b lim: 80 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 ChangeBit- 00:08:41.806 [2024-11-16 16:44:27.462745] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:41.806 [2024-11-16 16:44:27.462780] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:42.065 NEW_FUNC[1/2]: 0x1330b08 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:08:42.065 NEW_FUNC[2/2]: 0x1330da8 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 
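The #N NEW lines in these blocks are standard libFuzzer status records: cov: is the number of covered code edges, ft: the number of coverage features, corp: the corpus size in units and bytes, lim: the current input-length cap, exec/s the execution rate, rss resident memory, L: the length of the new input versus the largest seen, and MS: the mutation sequence that produced it; NEW_FUNC entries name functions reached for the first time. A rough post-processing sketch, assuming the per-fuzzer output had been captured to files named fuzz-N.log (hypothetical here, since Jenkins interleaves everything into one stream):

# Print the final edge-coverage figure reported by each run's
# "#N DONE cov: ..." summary line.
awk '/ DONE / {print FILENAME ": final cov=" $4}' fuzz-*.log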
00:08:42.065 #20 NEW cov: 10802 ft: 14773 corp: 5/191b lim: 80 exec/s: 20 rss: 68Mb L: 76/76 MS: 1 InsertRepeatedBytes- 00:08:42.065 [2024-11-16 16:44:27.658570] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.065 #21 NEW cov: 10802 ft: 15517 corp: 6/229b lim: 80 exec/s: 21 rss: 68Mb L: 38/76 MS: 1 ShuffleBytes- 00:08:42.324 [2024-11-16 16:44:27.843858] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.324 #22 NEW cov: 10802 ft: 15571 corp: 7/268b lim: 80 exec/s: 22 rss: 68Mb L: 39/76 MS: 1 InsertByte- 00:08:42.324 [2024-11-16 16:44:28.032914] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.583 #23 NEW cov: 10802 ft: 15683 corp: 8/327b lim: 80 exec/s: 23 rss: 68Mb L: 59/76 MS: 1 InsertRepeatedBytes- 00:08:42.583 [2024-11-16 16:44:28.218724] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.842 #24 NEW cov: 10802 ft: 17377 corp: 9/365b lim: 80 exec/s: 24 rss: 68Mb L: 38/76 MS: 1 ShuffleBytes- 00:08:42.842 [2024-11-16 16:44:28.415057] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:42.842 #25 NEW cov: 10809 ft: 17760 corp: 10/433b lim: 80 exec/s: 25 rss: 68Mb L: 68/76 MS: 1 InsertRepeatedBytes- 00:08:43.102 [2024-11-16 16:44:28.599701] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.102 #26 NEW cov: 10809 ft: 17970 corp: 11/472b lim: 80 exec/s: 13 rss: 68Mb L: 39/76 MS: 1 InsertByte- 00:08:43.102 #26 DONE cov: 10809 ft: 17970 corp: 11/472b lim: 80 exec/s: 13 rss: 68Mb 00:08:43.102 Done 26 runs in 2 second(s) 00:08:43.362 16:44:28 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:43.362 16:44:28 -- ../common.sh@72 -- # (( i++ )) 00:08:43.362 16:44:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:43.362 16:44:28 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:43.362 16:44:28 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:43.362 16:44:28 -- vfio/run.sh@23 -- # local timen=1 00:08:43.362 16:44:28 -- vfio/run.sh@24 -- # local core=0x1 00:08:43.362 16:44:28 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:43.362 16:44:28 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:43.362 16:44:28 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:43.362 16:44:28 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:43.362 16:44:28 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:43.362 16:44:28 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:43.362 16:44:28 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:43.362 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:43.362 16:44:28 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r 
/tmp/vfio-user-3/spdk3.sock -Z 3 00:08:43.362 [2024-11-16 16:44:29.010488] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:43.362 [2024-11-16 16:44:29.010585] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid495766 ] 00:08:43.362 EAL: No free 2048 kB hugepages reported on node 1 00:08:43.362 [2024-11-16 16:44:29.083656] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:43.621 [2024-11-16 16:44:29.120624] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:43.621 [2024-11-16 16:44:29.120770] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.621 INFO: Running with entropic power schedule (0xFF, 100). 00:08:43.621 INFO: Seed: 4255797081 00:08:43.621 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:43.621 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:43.621 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:43.621 INFO: A corpus is not provided, starting from an empty corpus 00:08:43.621 #2 INITED exec/s: 0 rss: 60Mb 00:08:43.621 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:43.621 This may also happen if the target rejected all inputs we tried so far 00:08:43.879 [2024-11-16 16:44:29.409717] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:08:43.879 [2024-11-16 16:44:29.409751] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:43.879 [2024-11-16 16:44:29.409762] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:43.879 [2024-11-16 16:44:29.409779] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:44.139 NEW_FUNC[1/637]: 0x452458 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:44.139 NEW_FUNC[2/637]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:44.139 #11 NEW cov: 10717 ft: 10745 corp: 2/129b lim: 320 exec/s: 0 rss: 65Mb L: 128/128 MS: 4 ShuffleBytes-CrossOver-CopyPart-InsertRepeatedBytes- 00:08:44.139 [2024-11-16 16:44:29.859580] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x88b02b05, 0x88b02b05) fd=325 offset=0 prot=0x3: Invalid argument 00:08:44.139 [2024-11-16 16:44:29.859612] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x88b02b05, 0x88b02b05) offset=0 flags=0x3: Invalid argument 00:08:44.139 [2024-11-16 16:44:29.859623] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.139 [2024-11-16 16:44:29.859640] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:44.397 NEW_FUNC[1/1]: 0x164e868 in nvme_pcie_qpair_submit_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_pcie_common.c:1622 00:08:44.397 #12 NEW cov: 10794 ft: 14422 corp: 3/257b lim: 320 exec/s: 0 
rss: 66Mb L: 128/128 MS: 1 CMP- DE: "\000\000\000\000\005+\260\210"- 00:08:44.397 [2024-11-16 16:44:30.058683] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x88b02b05, 0x88b02b05) fd=325 offset=0 prot=0x3: Invalid argument 00:08:44.397 [2024-11-16 16:44:30.058708] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x88b02b05, 0x88b02b05) offset=0 flags=0x3: Invalid argument 00:08:44.397 [2024-11-16 16:44:30.058719] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.397 [2024-11-16 16:44:30.058736] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:44.657 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:44.657 #13 NEW cov: 10811 ft: 14879 corp: 4/393b lim: 320 exec/s: 0 rss: 67Mb L: 136/136 MS: 1 PersAutoDict- DE: "\000\000\000\000\005+\260\210"- 00:08:44.657 [2024-11-16 16:44:30.249682] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:44.657 [2024-11-16 16:44:30.249710] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:44.657 [2024-11-16 16:44:30.249721] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.657 [2024-11-16 16:44:30.249738] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:44.657 #14 NEW cov: 10811 ft: 15570 corp: 5/521b lim: 320 exec/s: 14 rss: 67Mb L: 128/136 MS: 1 ChangeBit- 00:08:44.916 [2024-11-16 16:44:30.440911] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x88b02b05, 0x88b02b05) fd=325 offset=0 prot=0x3: Invalid argument 00:08:44.916 [2024-11-16 16:44:30.440934] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x88b02b05, 0x88b02b05) offset=0 flags=0x3: Invalid argument 00:08:44.916 [2024-11-16 16:44:30.440944] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.916 [2024-11-16 16:44:30.440960] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:44.916 #20 NEW cov: 10811 ft: 16164 corp: 6/657b lim: 320 exec/s: 20 rss: 67Mb L: 136/136 MS: 1 ChangeBit- 00:08:44.916 [2024-11-16 16:44:30.632387] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:44.916 [2024-11-16 16:44:30.632412] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:44.916 [2024-11-16 16:44:30.632423] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:44.916 [2024-11-16 16:44:30.632440] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.175 #21 NEW cov: 10811 ft: 16601 corp: 7/869b lim: 320 exec/s: 21 rss: 67Mb L: 212/212 MS: 1 InsertRepeatedBytes- 00:08:45.175 [2024-11-16 16:44:30.824469] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:45.175 [2024-11-16 16:44:30.824493] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:45.175 [2024-11-16 16:44:30.824503] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.175 [2024-11-16 16:44:30.824520] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.435 #22 NEW cov: 10811 ft: 16831 corp: 8/1081b lim: 320 exec/s: 22 rss: 67Mb L: 212/212 MS: 1 ChangeBit- 00:08:45.435 [2024-11-16 16:44:31.016615] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x88b02b05, 0x88b02b05) fd=325 offset=0 prot=0x3: Invalid argument 00:08:45.435 [2024-11-16 16:44:31.016638] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x88b02b05, 0x88b02b05) offset=0 flags=0x3: Invalid argument 00:08:45.435 [2024-11-16 16:44:31.016649] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.435 [2024-11-16 16:44:31.016666] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.435 #23 NEW cov: 10818 ft: 17196 corp: 9/1217b lim: 320 exec/s: 23 rss: 67Mb L: 136/212 MS: 1 ChangeByte- 00:08:45.695 [2024-11-16 16:44:31.207054] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x88b02b05, 0x88b02b05) fd=325 offset=0 prot=0x3: Invalid argument 00:08:45.695 [2024-11-16 16:44:31.207076] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x88b02b05, 0x88b02b05) offset=0 flags=0x3: Invalid argument 00:08:45.695 [2024-11-16 16:44:31.207086] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.695 [2024-11-16 16:44:31.207103] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.695 #24 NEW cov: 10818 ft: 17468 corp: 10/1354b lim: 320 exec/s: 24 rss: 67Mb L: 137/212 MS: 1 InsertByte- 00:08:45.695 [2024-11-16 16:44:31.396718] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:08:45.695 [2024-11-16 16:44:31.396744] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:08:45.695 [2024-11-16 16:44:31.396753] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:08:45.695 [2024-11-16 16:44:31.396785] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:08:45.954 #26 NEW cov: 10818 ft: 17778 corp: 11/1412b lim: 320 exec/s: 13 rss: 68Mb L: 58/212 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:45.954 #26 DONE cov: 10818 ft: 17778 corp: 11/1412b lim: 320 exec/s: 13 rss: 68Mb 00:08:45.954 ###### Recommended dictionary. ###### 00:08:45.954 "\000\000\000\000\005+\260\210" # Uses: 1 00:08:45.954 ###### End of recommended dictionary. 
###### 00:08:45.954 Done 26 runs in 2 second(s) 00:08:46.214 16:44:31 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:46.214 16:44:31 -- ../common.sh@72 -- # (( i++ )) 00:08:46.214 16:44:31 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:46.214 16:44:31 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:46.214 16:44:31 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:46.214 16:44:31 -- vfio/run.sh@23 -- # local timen=1 00:08:46.214 16:44:31 -- vfio/run.sh@24 -- # local core=0x1 00:08:46.214 16:44:31 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:46.215 16:44:31 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:46.215 16:44:31 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:46.215 16:44:31 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:46.215 16:44:31 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:46.215 16:44:31 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:46.215 16:44:31 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:46.215 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:46.215 16:44:31 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:46.215 [2024-11-16 16:44:31.821455] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:46.215 [2024-11-16 16:44:31.821547] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid496309 ] 00:08:46.215 EAL: No free 2048 kB hugepages reported on node 1 00:08:46.215 [2024-11-16 16:44:31.893616] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.215 [2024-11-16 16:44:31.928919] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:46.215 [2024-11-16 16:44:31.929079] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.474 INFO: Running with entropic power schedule (0xFF, 100). 00:08:46.474 INFO: Seed: 2763821538 00:08:46.474 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:46.474 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:46.474 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:46.474 INFO: A corpus is not provided, starting from an empty corpus 00:08:46.474 #2 INITED exec/s: 0 rss: 60Mb 00:08:46.474 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
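Each of these near-identical blocks is one iteration of the loop traced at the start of this section: common.sh derives fuzz_num from the number of .fn = entries in llvm_vfio_fuzz.c (seven here) and launches one short run per entry, so -Z walks through fuzz_vfio_user_region_rw, fuzz_vfio_user_version, fuzz_vfio_user_get_region_info, fuzz_vfio_user_dma_map and so on, exactly the functions named by the first NEW_FUNC line of each run. A reconstructed sketch of that loop (variable and helper names follow the xtrace output above):

# One fuzz target per '.fn =' table entry in the fuzzer source.
fuzz_num=$(grep -c '\.fn =' test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c)
for ((i = 0; i < fuzz_num; i++)); do
  # start_llvm_fuzz is the vfio/run.sh helper seen in the trace:
  # fuzzer type i, 1 second run time, core mask 0x1.
  start_llvm_fuzz "$i" 1 0x1
done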
00:08:46.474 This may also happen if the target rejected all inputs we tried so far 00:08:46.993 NEW_FUNC[1/632]: 0x452cd8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:46.993 NEW_FUNC[2/632]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:46.993 #8 NEW cov: 10751 ft: 10717 corp: 2/56b lim: 320 exec/s: 0 rss: 66Mb L: 55/55 MS: 1 InsertRepeatedBytes- 00:08:47.252 #13 NEW cov: 10765 ft: 13645 corp: 3/124b lim: 320 exec/s: 0 rss: 67Mb L: 68/68 MS: 5 ChangeBit-ChangeByte-CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:08:47.252 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:47.252 #14 NEW cov: 10782 ft: 14739 corp: 4/247b lim: 320 exec/s: 0 rss: 68Mb L: 123/123 MS: 1 CrossOver- 00:08:47.511 #15 NEW cov: 10782 ft: 15727 corp: 5/302b lim: 320 exec/s: 15 rss: 68Mb L: 55/123 MS: 1 CopyPart- 00:08:47.770 #16 NEW cov: 10785 ft: 15910 corp: 6/357b lim: 320 exec/s: 16 rss: 70Mb L: 55/123 MS: 1 ChangeBinInt- 00:08:48.029 #17 NEW cov: 10785 ft: 16256 corp: 7/412b lim: 320 exec/s: 17 rss: 70Mb L: 55/123 MS: 1 ShuffleBytes- 00:08:48.029 #18 NEW cov: 10785 ft: 16704 corp: 8/480b lim: 320 exec/s: 18 rss: 70Mb L: 68/123 MS: 1 ChangeBit- 00:08:48.288 #19 NEW cov: 10785 ft: 17012 corp: 9/541b lim: 320 exec/s: 19 rss: 70Mb L: 61/123 MS: 1 CopyPart- 00:08:48.548 #20 NEW cov: 10792 ft: 17056 corp: 10/609b lim: 320 exec/s: 20 rss: 70Mb L: 68/123 MS: 1 ChangeByte- 00:08:48.548 #21 NEW cov: 10792 ft: 17380 corp: 11/672b lim: 320 exec/s: 10 rss: 70Mb L: 63/123 MS: 1 CMP- DE: "\001\000\000\000\000\000\000s"- 00:08:48.548 #21 DONE cov: 10792 ft: 17380 corp: 11/672b lim: 320 exec/s: 10 rss: 70Mb 00:08:48.548 ###### Recommended dictionary. ###### 00:08:48.548 "\001\000\000\000\000\000\000s" # Uses: 0 00:08:48.548 ###### End of recommended dictionary. 
###### 00:08:48.548 Done 21 runs in 2 second(s) 00:08:48.807 16:44:34 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:48.807 16:44:34 -- ../common.sh@72 -- # (( i++ )) 00:08:48.807 16:44:34 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:48.807 16:44:34 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:48.807 16:44:34 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:48.807 16:44:34 -- vfio/run.sh@23 -- # local timen=1 00:08:48.807 16:44:34 -- vfio/run.sh@24 -- # local core=0x1 00:08:48.807 16:44:34 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:48.807 16:44:34 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:48.807 16:44:34 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:48.807 16:44:34 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:48.807 16:44:34 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:48.807 16:44:34 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:48.807 16:44:34 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:48.807 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:48.807 16:44:34 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:48.807 [2024-11-16 16:44:34.537985] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 00:08:48.807 [2024-11-16 16:44:34.538057] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid496848 ] 00:08:49.067 EAL: No free 2048 kB hugepages reported on node 1 00:08:49.067 [2024-11-16 16:44:34.610267] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:49.067 [2024-11-16 16:44:34.646866] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:49.067 [2024-11-16 16:44:34.647024] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:49.326 INFO: Running with entropic power schedule (0xFF, 100). 00:08:49.326 INFO: Seed: 1184871241 00:08:49.326 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:49.326 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:49.326 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.326 INFO: A corpus is not provided, starting from an empty corpus 00:08:49.326 #2 INITED exec/s: 0 rss: 59Mb 00:08:49.326 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:49.326 This may also happen if the target rejected all inputs we tried so far 00:08:49.326 [2024-11-16 16:44:34.928745] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:49.326 [2024-11-16 16:44:34.928789] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:49.586 NEW_FUNC[1/638]: 0x4536d8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:49.586 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:49.586 #7 NEW cov: 10783 ft: 10670 corp: 2/85b lim: 120 exec/s: 0 rss: 66Mb L: 84/84 MS: 5 CrossOver-CopyPart-EraseBytes-ChangeBit-InsertRepeatedBytes- 00:08:49.845 [2024-11-16 16:44:35.370393] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:49.845 [2024-11-16 16:44:35.370436] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:49.845 #10 NEW cov: 10798 ft: 13721 corp: 3/161b lim: 120 exec/s: 0 rss: 67Mb L: 76/84 MS: 3 InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:08:49.845 [2024-11-16 16:44:35.549317] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:49.845 [2024-11-16 16:44:35.549348] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.103 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:50.103 #12 NEW cov: 10815 ft: 15044 corp: 4/213b lim: 120 exec/s: 0 rss: 68Mb L: 52/84 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:50.103 [2024-11-16 16:44:35.729436] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.103 [2024-11-16 16:44:35.729465] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.103 #13 NEW cov: 10815 ft: 15549 corp: 5/265b lim: 120 exec/s: 0 rss: 68Mb L: 52/84 MS: 1 ChangeBit- 00:08:50.362 [2024-11-16 16:44:35.899010] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.362 [2024-11-16 16:44:35.899039] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.362 #14 NEW cov: 10815 ft: 16127 corp: 6/327b lim: 120 exec/s: 14 rss: 68Mb L: 62/84 MS: 1 CrossOver- 00:08:50.362 [2024-11-16 16:44:36.068213] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.362 [2024-11-16 16:44:36.068243] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.621 #25 NEW cov: 10815 ft: 16875 corp: 7/382b lim: 120 exec/s: 25 rss: 68Mb L: 55/84 MS: 1 EraseBytes- 00:08:50.621 [2024-11-16 16:44:36.235179] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.621 [2024-11-16 16:44:36.235208] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.621 #26 NEW cov: 10815 ft: 17046 corp: 8/434b lim: 120 exec/s: 26 rss: 68Mb L: 52/84 MS: 1 ChangeByte- 00:08:50.881 [2024-11-16 16:44:36.403391] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.881 [2024-11-16 16:44:36.403420] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.881 #27 NEW cov: 10815 ft: 17237 corp: 9/497b lim: 120 exec/s: 27 rss: 68Mb L: 63/84 
MS: 1 InsertByte- 00:08:50.881 [2024-11-16 16:44:36.570528] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.881 [2024-11-16 16:44:36.570559] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.140 #28 NEW cov: 10822 ft: 17616 corp: 10/573b lim: 120 exec/s: 28 rss: 68Mb L: 76/84 MS: 1 CopyPart- 00:08:51.140 [2024-11-16 16:44:36.737711] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.140 [2024-11-16 16:44:36.737743] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.140 #29 NEW cov: 10822 ft: 17635 corp: 11/668b lim: 120 exec/s: 14 rss: 68Mb L: 95/95 MS: 1 CrossOver- 00:08:51.140 #29 DONE cov: 10822 ft: 17635 corp: 11/668b lim: 120 exec/s: 14 rss: 68Mb 00:08:51.140 Done 29 runs in 2 second(s) 00:08:51.400 16:44:37 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:08:51.400 16:44:37 -- ../common.sh@72 -- # (( i++ )) 00:08:51.400 16:44:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:51.400 16:44:37 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:51.400 16:44:37 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:51.400 16:44:37 -- vfio/run.sh@23 -- # local timen=1 00:08:51.400 16:44:37 -- vfio/run.sh@24 -- # local core=0x1 00:08:51.400 16:44:37 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:51.400 16:44:37 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:51.400 16:44:37 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:51.400 16:44:37 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:51.400 16:44:37 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:51.400 16:44:37 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:51.400 16:44:37 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:51.400 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:51.400 16:44:37 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:51.400 [2024-11-16 16:44:37.141646] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 22.11.4 initialization... 
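As with the earlier instances, run.sh@31 and @34 build an isolated sandbox for this fuzzer before launching it: fresh /tmp/vfio-user-6 directories plus a config derived from the shared template by substituting the instance-specific domain paths. The templating step in isolation looks like this (the redirection into the per-instance fuzz_vfio_json.conf is implied by run.sh but hidden by xtrace):

N=6
mkdir -p /tmp/vfio-user-$N/domain/1 /tmp/vfio-user-$N/domain/2
# Rewrite the template's generic /tmp/vfio-user paths to this
# instance's numbered sandbox.
sed -e "s%/tmp/vfio-user/domain/1%/tmp/vfio-user-$N/domain/1%;
        s%/tmp/vfio-user/domain/2%/tmp/vfio-user-$N/domain/2%" \
    test/fuzz/llvm/vfio/fuzz_vfio_json.conf \
    > /tmp/vfio-user-$N/fuzz_vfio_json.conf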
00:08:51.400 [2024-11-16 16:44:37.141728] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid497330 ] 00:08:51.660 EAL: No free 2048 kB hugepages reported on node 1 00:08:51.660 [2024-11-16 16:44:37.212054] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:51.660 [2024-11-16 16:44:37.248202] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:51.660 [2024-11-16 16:44:37.248364] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.918 INFO: Running with entropic power schedule (0xFF, 100). 00:08:51.918 INFO: Seed: 3799839169 00:08:51.918 INFO: Loaded 1 modules (341841 inline 8-bit counters): 341841 [0x263b5cc, 0x268ed1d), 00:08:51.918 INFO: Loaded 1 PC tables (341841 PCs): 341841 [0x268ed20,0x2bc6230), 00:08:51.918 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:51.918 INFO: A corpus is not provided, starting from an empty corpus 00:08:51.918 #2 INITED exec/s: 0 rss: 60Mb 00:08:51.918 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:51.918 This may also happen if the target rejected all inputs we tried so far 00:08:51.918 [2024-11-16 16:44:37.538733] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.918 [2024-11-16 16:44:37.538773] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.176 NEW_FUNC[1/638]: 0x4543c8 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:52.176 NEW_FUNC[2/638]: 0x456978 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:52.176 #12 NEW cov: 10769 ft: 10655 corp: 2/11b lim: 90 exec/s: 0 rss: 66Mb L: 10/10 MS: 5 ChangeBit-CopyPart-CopyPart-ChangeBit-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:52.435 [2024-11-16 16:44:37.994956] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.435 [2024-11-16 16:44:37.994997] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.435 #13 NEW cov: 10783 ft: 13371 corp: 3/56b lim: 90 exec/s: 0 rss: 67Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:52.435 [2024-11-16 16:44:38.172720] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.435 [2024-11-16 16:44:38.172749] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.694 NEW_FUNC[1/1]: 0x19341e8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:52.694 #14 NEW cov: 10800 ft: 14218 corp: 4/143b lim: 90 exec/s: 0 rss: 68Mb L: 87/87 MS: 1 InsertRepeatedBytes- 00:08:52.694 [2024-11-16 16:44:38.350362] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.694 [2024-11-16 16:44:38.350393] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.954 #15 NEW cov: 10803 ft: 14958 corp: 5/152b lim: 90 exec/s: 0 rss: 68Mb L: 9/87 MS: 1 EraseBytes- 00:08:52.954 [2024-11-16 16:44:38.526819] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 
00:08:52.954 [2024-11-16 16:44:38.526850] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.954 #16 NEW cov: 10803 ft: 15152 corp: 6/162b lim: 90 exec/s: 16 rss: 68Mb L: 10/87 MS: 1 ChangeByte- 00:08:53.212 [2024-11-16 16:44:38.705156] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.212 [2024-11-16 16:44:38.705187] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.212 #17 NEW cov: 10803 ft: 15256 corp: 7/180b lim: 90 exec/s: 17 rss: 68Mb L: 18/87 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:53.213 [2024-11-16 16:44:38.880682] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.213 [2024-11-16 16:44:38.880712] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.472 #22 NEW cov: 10803 ft: 15343 corp: 8/194b lim: 90 exec/s: 22 rss: 68Mb L: 14/87 MS: 5 CrossOver-CopyPart-EraseBytes-CopyPart-PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:53.472 [2024-11-16 16:44:39.058006] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.472 [2024-11-16 16:44:39.058035] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.472 #27 NEW cov: 10803 ft: 15451 corp: 9/205b lim: 90 exec/s: 27 rss: 68Mb L: 11/87 MS: 5 ChangeBit-CopyPart-CopyPart-ChangeByte-CrossOver- 00:08:53.729 [2024-11-16 16:44:39.243807] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.729 [2024-11-16 16:44:39.243835] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.729 #28 NEW cov: 10810 ft: 15602 corp: 10/292b lim: 90 exec/s: 28 rss: 68Mb L: 87/87 MS: 1 ChangeBit- 00:08:53.729 [2024-11-16 16:44:39.418888] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.729 [2024-11-16 16:44:39.418916] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.988 #29 NEW cov: 10810 ft: 15724 corp: 11/303b lim: 90 exec/s: 14 rss: 68Mb L: 11/87 MS: 1 ChangeASCIIInt- 00:08:53.988 #29 DONE cov: 10810 ft: 15724 corp: 11/303b lim: 90 exec/s: 14 rss: 68Mb 00:08:53.988 ###### Recommended dictionary. ###### 00:08:53.988 "\001\000\000\000\000\000\000\000" # Uses: 2 00:08:53.988 ###### End of recommended dictionary. 
###### 00:08:53.988 Done 29 runs in 2 second(s) 00:08:54.247 16:44:39 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:54.247 16:44:39 -- ../common.sh@72 -- # (( i++ )) 00:08:54.247 16:44:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:54.247 16:44:39 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:54.247 00:08:54.247 real 0m19.421s 00:08:54.247 user 0m27.453s 00:08:54.247 sys 0m1.880s 00:08:54.247 16:44:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:54.247 16:44:39 -- common/autotest_common.sh@10 -- # set +x 00:08:54.247 ************************************ 00:08:54.247 END TEST vfio_fuzz 00:08:54.247 ************************************ 00:08:54.247 00:08:54.247 real 1m22.099s 00:08:54.247 user 2m6.863s 00:08:54.247 sys 0m8.956s 00:08:54.247 16:44:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:54.247 16:44:39 -- common/autotest_common.sh@10 -- # set +x 00:08:54.247 ************************************ 00:08:54.247 END TEST llvm_fuzz 00:08:54.247 ************************************ 00:08:54.247 16:44:39 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:08:54.247 16:44:39 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:08:54.247 16:44:39 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:08:54.247 16:44:39 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:54.247 16:44:39 -- common/autotest_common.sh@10 -- # set +x 00:08:54.247 16:44:39 -- spdk/autotest.sh@373 -- # autotest_cleanup 00:08:54.247 16:44:39 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:08:54.247 16:44:39 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:08:54.247 16:44:39 -- common/autotest_common.sh@10 -- # set +x 00:09:00.822 INFO: APP EXITING 00:09:00.822 INFO: killing all VMs 00:09:00.822 INFO: killing vhost app 00:09:00.822 INFO: EXIT DONE 00:09:04.116 Waiting for block devices as requested 00:09:04.116 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:04.116 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:04.116 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:04.116 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:04.116 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:04.116 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:04.375 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:04.376 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:04.376 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:04.635 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:04.635 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:04.635 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:04.895 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:04.895 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:04.895 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:05.155 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:05.155 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:09.352 Cleaning 00:09:09.352 Removing: /dev/shm/spdk_tgt_trace.pid459985 00:09:09.352 Removing: /var/run/dpdk/spdk_pid457490 00:09:09.352 Removing: /var/run/dpdk/spdk_pid458756 00:09:09.352 Removing: /var/run/dpdk/spdk_pid459985 00:09:09.352 Removing: /var/run/dpdk/spdk_pid460783 00:09:09.352 Removing: /var/run/dpdk/spdk_pid461112 00:09:09.352 Removing: /var/run/dpdk/spdk_pid461449 00:09:09.352 Removing: /var/run/dpdk/spdk_pid461788 00:09:09.352 Removing: /var/run/dpdk/spdk_pid462127 00:09:09.352 Removing: /var/run/dpdk/spdk_pid462420 00:09:09.352 Removing: /var/run/dpdk/spdk_pid462703 00:09:09.352 Removing: /var/run/dpdk/spdk_pid463021 00:09:09.352 
Removing: /var/run/dpdk/spdk_pid463902 00:09:09.352 Removing: /var/run/dpdk/spdk_pid467100 00:09:09.352 Removing: /var/run/dpdk/spdk_pid467433 00:09:09.352 Removing: /var/run/dpdk/spdk_pid467938 00:09:09.352 Removing: /var/run/dpdk/spdk_pid468104 00:09:09.352 Removing: /var/run/dpdk/spdk_pid468900 00:09:09.352 Removing: /var/run/dpdk/spdk_pid469106 00:09:09.352 Removing: /var/run/dpdk/spdk_pid469686 00:09:09.352 Removing: /var/run/dpdk/spdk_pid469950 00:09:09.352 Removing: /var/run/dpdk/spdk_pid470248 00:09:09.352 Removing: /var/run/dpdk/spdk_pid470271 00:09:09.352 Removing: /var/run/dpdk/spdk_pid470559 00:09:09.352 Removing: /var/run/dpdk/spdk_pid470767 00:09:09.352 Removing: /var/run/dpdk/spdk_pid471212 00:09:09.352 Removing: /var/run/dpdk/spdk_pid471496 00:09:09.352 Removing: /var/run/dpdk/spdk_pid471784 00:09:09.352 Removing: /var/run/dpdk/spdk_pid471865 00:09:09.352 Removing: /var/run/dpdk/spdk_pid472165 00:09:09.352 Removing: /var/run/dpdk/spdk_pid472220 00:09:09.352 Removing: /var/run/dpdk/spdk_pid472496 00:09:09.352 Removing: /var/run/dpdk/spdk_pid472656 00:09:09.352 Removing: /var/run/dpdk/spdk_pid472825 00:09:09.352 Removing: /var/run/dpdk/spdk_pid473064 00:09:09.352 Removing: /var/run/dpdk/spdk_pid473353 00:09:09.352 Removing: /var/run/dpdk/spdk_pid473619 00:09:09.352 Removing: /var/run/dpdk/spdk_pid473904 00:09:09.352 Removing: /var/run/dpdk/spdk_pid474136 00:09:09.352 Removing: /var/run/dpdk/spdk_pid474317 00:09:09.352 Removing: /var/run/dpdk/spdk_pid474481 00:09:09.352 Removing: /var/run/dpdk/spdk_pid474766 00:09:09.352 Removing: /var/run/dpdk/spdk_pid475034 00:09:09.352 Removing: /var/run/dpdk/spdk_pid475321 00:09:09.352 Removing: /var/run/dpdk/spdk_pid475591 00:09:09.352 Removing: /var/run/dpdk/spdk_pid475821 00:09:09.352 Removing: /var/run/dpdk/spdk_pid475989 00:09:09.352 Removing: /var/run/dpdk/spdk_pid476183 00:09:09.352 Removing: /var/run/dpdk/spdk_pid476449 00:09:09.352 Removing: /var/run/dpdk/spdk_pid476732 00:09:09.352 Removing: /var/run/dpdk/spdk_pid477004 00:09:09.352 Removing: /var/run/dpdk/spdk_pid477283 00:09:09.352 Removing: /var/run/dpdk/spdk_pid477443 00:09:09.352 Removing: /var/run/dpdk/spdk_pid477616 00:09:09.352 Removing: /var/run/dpdk/spdk_pid477868 00:09:09.352 Removing: /var/run/dpdk/spdk_pid478149 00:09:09.352 Removing: /var/run/dpdk/spdk_pid478417 00:09:09.352 Removing: /var/run/dpdk/spdk_pid478698 00:09:09.352 Removing: /var/run/dpdk/spdk_pid478917 00:09:09.352 Removing: /var/run/dpdk/spdk_pid479093 00:09:09.352 Removing: /var/run/dpdk/spdk_pid479279 00:09:09.352 Removing: /var/run/dpdk/spdk_pid479560 00:09:09.352 Removing: /var/run/dpdk/spdk_pid479834 00:09:09.352 Removing: /var/run/dpdk/spdk_pid480115 00:09:09.352 Removing: /var/run/dpdk/spdk_pid480385 00:09:09.352 Removing: /var/run/dpdk/spdk_pid480591 00:09:09.352 Removing: /var/run/dpdk/spdk_pid480747 00:09:09.352 Removing: /var/run/dpdk/spdk_pid480990 00:09:09.352 Removing: /var/run/dpdk/spdk_pid481256 00:09:09.352 Removing: /var/run/dpdk/spdk_pid481545 00:09:09.352 Removing: /var/run/dpdk/spdk_pid481811 00:09:09.352 Removing: /var/run/dpdk/spdk_pid482093 00:09:09.352 Removing: /var/run/dpdk/spdk_pid482176 00:09:09.352 Removing: /var/run/dpdk/spdk_pid482512 00:09:09.352 Removing: /var/run/dpdk/spdk_pid483024 00:09:09.352 Removing: /var/run/dpdk/spdk_pid483557 00:09:09.352 Removing: /var/run/dpdk/spdk_pid484019 00:09:09.352 Removing: /var/run/dpdk/spdk_pid484386 00:09:09.352 Removing: /var/run/dpdk/spdk_pid484938 00:09:09.352 Removing: /var/run/dpdk/spdk_pid485278 00:09:09.352 Removing: 
/var/run/dpdk/spdk_pid485776 00:09:09.352 Removing: /var/run/dpdk/spdk_pid486309 00:09:09.352 Removing: /var/run/dpdk/spdk_pid486605 00:09:09.352 Removing: /var/run/dpdk/spdk_pid487145 00:09:09.352 Removing: /var/run/dpdk/spdk_pid487591 00:09:09.352 Removing: /var/run/dpdk/spdk_pid487978 00:09:09.352 Removing: /var/run/dpdk/spdk_pid488515 00:09:09.352 Removing: /var/run/dpdk/spdk_pid488830 00:09:09.352 Removing: /var/run/dpdk/spdk_pid489347 00:09:09.352 Removing: /var/run/dpdk/spdk_pid489813 00:09:09.352 Removing: /var/run/dpdk/spdk_pid490179 00:09:09.352 Removing: /var/run/dpdk/spdk_pid490716 00:09:09.352 Removing: /var/run/dpdk/spdk_pid491104 00:09:09.352 Removing: /var/run/dpdk/spdk_pid491549 00:09:09.352 Removing: /var/run/dpdk/spdk_pid492094 00:09:09.352 Removing: /var/run/dpdk/spdk_pid492384 00:09:09.352 Removing: /var/run/dpdk/spdk_pid492917 00:09:09.352 Removing: /var/run/dpdk/spdk_pid493346 00:09:09.352 Removing: /var/run/dpdk/spdk_pid493750 00:09:09.352 Removing: /var/run/dpdk/spdk_pid494378 00:09:09.352 Removing: /var/run/dpdk/spdk_pid494922 00:09:09.352 Removing: /var/run/dpdk/spdk_pid495360 00:09:09.352 Removing: /var/run/dpdk/spdk_pid495766 00:09:09.352 Removing: /var/run/dpdk/spdk_pid496309 00:09:09.352 Removing: /var/run/dpdk/spdk_pid496848 00:09:09.352 Removing: /var/run/dpdk/spdk_pid497330 00:09:09.353 Clean 00:09:09.353 killing process with pid 408764 00:09:13.549 killing process with pid 408761 00:09:13.549 killing process with pid 408763 00:09:13.549 killing process with pid 408762 00:09:13.549 16:44:58 -- common/autotest_common.sh@1446 -- # return 0 00:09:13.549 16:44:58 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:09:13.549 16:44:58 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:13.549 16:44:58 -- common/autotest_common.sh@10 -- # set +x 00:09:13.549 16:44:58 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:09:13.549 16:44:58 -- common/autotest_common.sh@728 -- # xtrace_disable 00:09:13.549 16:44:58 -- common/autotest_common.sh@10 -- # set +x 00:09:13.549 16:44:58 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:13.549 16:44:58 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:13.549 16:44:58 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:13.549 16:44:58 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:09:13.549 16:44:58 -- spdk/autotest.sh@383 -- # hostname 00:09:13.549 16:44:58 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:13.549 geninfo: WARNING: invalid characters removed from testname! 
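The capture step just above points lcov at a gcov-compatible wrapper (test/fuzz/llvm/llvm-gcov.sh) so that counters from clang-instrumented fuzz binaries can be read. The wrapper's contents are not shown in this log; a minimal sketch of what such a wrapper typically is, assuming llvm-cov is on PATH:

  #!/usr/bin/env bash
  # llvm-gcov.sh (hypothetical contents): let lcov run llvm-cov's
  # gcov-compatible mode wherever it would normally invoke gcov.
  exec llvm-cov gcov "$@"

The capture above and the lcov -a/-r calls that follow in this trace then amount to a capture, merge, prune sequence; a condensed sketch, with $SPDK_DIR standing in for the repository path and file names taken from the commands themselves:

  # Capture per-test counters, fold them into the baseline, then strip
  # paths that should not count against SPDK coverage.
  lcov -q -c --no-external --gcov-tool ./llvm-gcov.sh \
       -d "$SPDK_DIR" -t "$(hostname)" -o cov_test.info
  lcov -q -a cov_base.info -a cov_test.info -o cov_total.info   # union of both runs
  lcov -q -r cov_total.info '*/dpdk/*' -o cov_total.info        # drop vendored DPDK
  lcov -q -r cov_total.info '/usr/*' --ignore-errors unused,unused \
       -o cov_total.info                                        # drop system headers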
00:09:14.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:09:14.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:09:14.486 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:09:24.473 16:45:10 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:32.598 16:45:16 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:35.887 16:45:21 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:41.162 16:45:26 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:45.356 16:45:30 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:50.633 16:45:35 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 
-q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:54.918 16:45:40 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:54.918 16:45:40 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:09:54.918 16:45:40 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:09:54.918 16:45:40 -- common/autotest_common.sh@1690 -- $ lcov --version 00:09:54.918 16:45:40 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:09:54.918 16:45:40 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:09:54.918 16:45:40 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:09:54.918 16:45:40 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:09:54.918 16:45:40 -- scripts/common.sh@335 -- $ IFS=.-: 00:09:54.918 16:45:40 -- scripts/common.sh@335 -- $ read -ra ver1 00:09:54.918 16:45:40 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:54.918 16:45:40 -- scripts/common.sh@336 -- $ read -ra ver2 00:09:54.919 16:45:40 -- scripts/common.sh@337 -- $ local 'op=<' 00:09:54.919 16:45:40 -- scripts/common.sh@339 -- $ ver1_l=2 00:09:54.919 16:45:40 -- scripts/common.sh@340 -- $ ver2_l=1 00:09:54.919 16:45:40 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:09:54.919 16:45:40 -- scripts/common.sh@343 -- $ case "$op" in 00:09:54.919 16:45:40 -- scripts/common.sh@344 -- $ : 1 00:09:54.919 16:45:40 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:09:54.919 16:45:40 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:09:54.919 16:45:40 -- scripts/common.sh@364 -- $ decimal 1 00:09:54.919 16:45:40 -- scripts/common.sh@352 -- $ local d=1 00:09:54.919 16:45:40 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:54.919 16:45:40 -- scripts/common.sh@354 -- $ echo 1 00:09:54.919 16:45:40 -- scripts/common.sh@364 -- $ ver1[v]=1 00:09:54.919 16:45:40 -- scripts/common.sh@365 -- $ decimal 2 00:09:54.919 16:45:40 -- scripts/common.sh@352 -- $ local d=2 00:09:54.919 16:45:40 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:54.919 16:45:40 -- scripts/common.sh@354 -- $ echo 2 00:09:54.919 16:45:40 -- scripts/common.sh@365 -- $ ver2[v]=2 00:09:54.919 16:45:40 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:09:54.919 16:45:40 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:09:54.919 16:45:40 -- scripts/common.sh@367 -- $ return 0 00:09:54.919 16:45:40 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:54.919 16:45:40 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:09:54.919 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.919 --rc genhtml_branch_coverage=1 00:09:54.919 --rc genhtml_function_coverage=1 00:09:54.919 --rc genhtml_legend=1 00:09:54.919 --rc geninfo_all_blocks=1 00:09:54.919 --rc geninfo_unexecuted_blocks=1 00:09:54.919 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.919 ' 00:09:54.919 16:45:40 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:09:54.919 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.919 --rc genhtml_branch_coverage=1 00:09:54.919 --rc genhtml_function_coverage=1 00:09:54.919 --rc genhtml_legend=1 00:09:54.919 --rc geninfo_all_blocks=1 00:09:54.919 --rc geninfo_unexecuted_blocks=1 00:09:54.919 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.919 ' 00:09:54.919 
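The lt / cmp_versions xtrace above is a component-wise dotted-version comparison used to decide which lcov option spelling to emit. A compact reconstruction of that logic, as a sketch rather than the verbatim scripts/common.sh code, assuming purely numeric version components:

  # Return 0 (true) when version $1 sorts strictly below version $2.
  ver_lt() {
      local IFS=.-:                 # split on the same separators the trace uses
      local -a a=($1) b=($2)
      local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
      for (( i = 0; i < n; i++ )); do
          local x=${a[i]:-0} y=${b[i]:-0}   # missing components compare as 0
          (( x < y )) && return 0           # earliest differing component decides
          (( x > y )) && return 1
      done
      return 1                              # equal versions are not "less than"
  }

  ver_lt 1.15 2 && echo "lcov < 2: keep the 1.x --rc option names"

Here 1.15 < 2 holds (the traced function returns 0), which is why the harness sets lcov_rc_opt to the lcov 1.x names lcov_branch_coverage / lcov_function_coverage rather than the 2.x spellings.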
16:45:40 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov 00:09:54.919 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.919 --rc genhtml_branch_coverage=1 00:09:54.919 --rc genhtml_function_coverage=1 00:09:54.919 --rc genhtml_legend=1 00:09:54.919 --rc geninfo_all_blocks=1 00:09:54.919 --rc geninfo_unexecuted_blocks=1 00:09:54.919 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.919 ' 00:09:54.919 16:45:40 -- common/autotest_common.sh@1704 -- $ LCOV='lcov 00:09:54.919 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:54.919 --rc genhtml_branch_coverage=1 00:09:54.919 --rc genhtml_function_coverage=1 00:09:54.919 --rc genhtml_legend=1 00:09:54.919 --rc geninfo_all_blocks=1 00:09:54.919 --rc geninfo_unexecuted_blocks=1 00:09:54.919 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:54.919 ' 00:09:54.919 16:45:40 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:54.919 16:45:40 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:54.919 16:45:40 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:54.919 16:45:40 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:54.919 16:45:40 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:54.919 16:45:40 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:54.919 16:45:40 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:54.919 16:45:40 -- paths/export.sh@5 -- $ export PATH 00:09:54.919 16:45:40 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:54.919 16:45:40 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:54.919 16:45:40 -- common/autobuild_common.sh@440 -- $ date +%s 00:09:54.919 16:45:40 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1731771940.XXXXXX 00:09:54.919 16:45:40 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1731771940.Z4KlAa 00:09:54.919 16:45:40 -- common/autobuild_common.sh@442 -- 
$ [[ -n '' ]] 00:09:54.919 16:45:40 -- common/autobuild_common.sh@446 -- $ '[' -n v22.11.4 ']' 00:09:54.919 16:45:40 -- common/autobuild_common.sh@447 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:09:54.919 16:45:40 -- common/autobuild_common.sh@447 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:09:54.919 16:45:40 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:54.919 16:45:40 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:54.919 16:45:40 -- common/autobuild_common.sh@456 -- $ get_config_params 00:09:54.919 16:45:40 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:09:54.919 16:45:40 -- common/autotest_common.sh@10 -- $ set +x 00:09:54.919 16:45:40 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:09:54.919 16:45:40 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:09:54.919 16:45:40 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:54.919 16:45:40 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:54.919 16:45:40 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:09:54.919 16:45:40 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:54.919 16:45:40 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:54.919 16:45:40 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:54.919 16:45:40 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:54.919 16:45:40 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:54.919 16:45:40 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:54.919 + [[ -n 353445 ]] 00:09:54.919 + sudo kill 353445 00:09:54.987 [Pipeline] } 00:09:55.001 [Pipeline] // stage 00:09:55.006 [Pipeline] } 00:09:55.020 [Pipeline] // timeout 00:09:55.025 [Pipeline] } 00:09:55.038 [Pipeline] // catchError 00:09:55.042 [Pipeline] } 00:09:55.056 [Pipeline] // wrap 00:09:55.061 [Pipeline] } 00:09:55.075 [Pipeline] // catchError 00:09:55.083 [Pipeline] stage 00:09:55.085 [Pipeline] { (Epilogue) 00:09:55.098 [Pipeline] catchError 00:09:55.099 [Pipeline] { 00:09:55.112 [Pipeline] echo 00:09:55.114 Cleanup processes 00:09:55.120 [Pipeline] sh 00:09:55.458 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:55.458 507488 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:55.513 [Pipeline] sh 00:09:55.888 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:55.888 ++ grep -v 'sudo pgrep' 00:09:55.888 ++ awk '{print $1}' 00:09:55.888 + sudo kill -9 00:09:55.888 + true 00:09:55.901 [Pipeline] sh 00:09:56.187 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:56.187 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 
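The epilogue's process sweep above is worth spelling out: pgrep's only match is the pgrep command itself, so after the grep -v filter, kill -9 receives no PIDs, exits non-zero, and the trailing || true (the "+ true" in the trace) absorbs the failure so the stage continues. A sketch of that idiom, with $WORKSPACE standing in for the Jenkins workspace path:

  # Sweep anything still running out of the workspace; tolerate "nothing found".
  pids=$(sudo pgrep -af "$WORKSPACE/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
  sudo kill -9 $pids || true   # $pids left unquoted on purpose: one argument per
                               # PID; an empty list makes kill fail, true swallows it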
00:09:56.187 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:57.565 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:10:07.559 [Pipeline] sh 00:10:07.846 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:10:07.846 Artifacts sizes are good 00:10:07.860 [Pipeline] archiveArtifacts 00:10:07.867 Archiving artifacts 00:10:07.997 [Pipeline] sh 00:10:08.281 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:10:08.296 [Pipeline] cleanWs 00:10:08.306 [WS-CLEANUP] Deleting project workspace... 00:10:08.306 [WS-CLEANUP] Deferred wipeout is used... 00:10:08.312 [WS-CLEANUP] done 00:10:08.314 [Pipeline] } 00:10:08.330 [Pipeline] // catchError 00:10:08.342 [Pipeline] sh 00:10:08.626 + logger -p user.info -t JENKINS-CI 00:10:08.635 [Pipeline] } 00:10:08.649 [Pipeline] // stage 00:10:08.654 [Pipeline] } 00:10:08.668 [Pipeline] // node 00:10:08.673 [Pipeline] End of Pipeline 00:10:08.726 Finished: SUCCESS