00:00:00.001 Started by upstream project "autotest-spdk-v24.09-vs-dpdk-v22.11" build number 173 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3675 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.086 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.087 The recommended git tool is: git 00:00:00.087 using credential 00000000-0000-0000-0000-000000000002 00:00:00.091 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.107 Fetching changes from the remote Git repository 00:00:00.109 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.133 Using shallow fetch with depth 1 00:00:00.133 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.133 > git --version # timeout=10 00:00:00.157 > git --version # 'git version 2.39.2' 00:00:00.157 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.181 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.181 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:05.603 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:05.613 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:05.624 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:05.624 > git config core.sparsecheckout # timeout=10 00:00:05.634 > git read-tree -mu HEAD # timeout=10 00:00:05.649 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:05.671 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:05.671 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:05.772 [Pipeline] Start of Pipeline 00:00:05.786 [Pipeline] library 00:00:05.788 Loading library shm_lib@master 00:00:05.788 Library shm_lib@master is cached. Copying from home. 00:00:05.809 [Pipeline] node 00:00:05.831 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:05.833 [Pipeline] { 00:00:05.843 [Pipeline] catchError 00:00:05.845 [Pipeline] { 00:00:05.860 [Pipeline] wrap 00:00:05.872 [Pipeline] { 00:00:05.883 [Pipeline] stage 00:00:05.886 [Pipeline] { (Prologue) 00:00:06.140 [Pipeline] sh 00:00:06.424 + logger -p user.info -t JENKINS-CI 00:00:06.441 [Pipeline] echo 00:00:06.443 Node: WFP20 00:00:06.449 [Pipeline] sh 00:00:06.747 [Pipeline] setCustomBuildProperty 00:00:06.759 [Pipeline] echo 00:00:06.761 Cleanup processes 00:00:06.767 [Pipeline] sh 00:00:07.051 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.051 3612605 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.064 [Pipeline] sh 00:00:07.347 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:07.347 ++ grep -v 'sudo pgrep' 00:00:07.347 ++ awk '{print $1}' 00:00:07.347 + sudo kill -9 00:00:07.347 + true 00:00:07.362 [Pipeline] cleanWs 00:00:07.372 [WS-CLEANUP] Deleting project workspace... 00:00:07.372 [WS-CLEANUP] Deferred wipeout is used... 
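The prologue above looks for SPDK processes left over from a previous run in this workspace and kills them before continuing. A minimal standalone sketch of that same pgrep/kill pattern (WORKSPACE stands in for the Jenkins workspace path; the trailing `|| true` mirrors the `+ true` in the trace, so an empty kill list does not fail the step):

    # Sketch: kill stale SPDK processes from an earlier run in this workspace.
    WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest
    # pgrep -af matches against full command lines; drop the pgrep itself, keep PIDs.
    pids=$(sudo pgrep -af "$WORKSPACE/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
    sudo kill -9 $pids || true   # $pids intentionally unquoted; tolerate "no targets"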
00:00:07.379 [WS-CLEANUP] done 00:00:07.384 [Pipeline] setCustomBuildProperty 00:00:07.403 [Pipeline] sh 00:00:07.688 + sudo git config --global --replace-all safe.directory '*' 00:00:07.752 [Pipeline] httpRequest 00:00:08.128 [Pipeline] echo 00:00:08.129 Sorcerer 10.211.164.20 is alive 00:00:08.136 [Pipeline] retry 00:00:08.137 [Pipeline] { 00:00:08.145 [Pipeline] httpRequest 00:00:08.149 HttpMethod: GET 00:00:08.149 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:08.150 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:08.164 Response Code: HTTP/1.1 200 OK 00:00:08.164 Success: Status code 200 is in the accepted range: 200,404 00:00:08.165 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:31.739 [Pipeline] } 00:00:31.757 [Pipeline] // retry 00:00:31.764 [Pipeline] sh 00:00:32.050 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:32.066 [Pipeline] httpRequest 00:00:32.583 [Pipeline] echo 00:00:32.586 Sorcerer 10.211.164.20 is alive 00:00:32.596 [Pipeline] retry 00:00:32.599 [Pipeline] { 00:00:32.613 [Pipeline] httpRequest 00:00:32.618 HttpMethod: GET 00:00:32.619 URL: http://10.211.164.20/packages/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:00:32.619 Sending request to url: http://10.211.164.20/packages/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:00:32.634 Response Code: HTTP/1.1 200 OK 00:00:32.634 Success: Status code 200 is in the accepted range: 200,404 00:00:32.634 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:01:38.975 [Pipeline] } 00:01:38.993 [Pipeline] // retry 00:01:39.001 [Pipeline] sh 00:01:39.287 + tar --no-same-owner -xf spdk_b18e1bd6297ec2f89ab275de3193457af1c946df.tar.gz 00:01:41.842 [Pipeline] sh 00:01:42.128 + git -C spdk log --oneline -n5 00:01:42.128 b18e1bd62 version: v24.09.1-pre 00:01:42.128 19524ad45 version: v24.09 00:01:42.128 9756b40a3 dpdk: update submodule to include alarm_cancel fix 00:01:42.128 a808500d2 test/nvmf: disable nvmf_shutdown_tc4 on e810 00:01:42.128 3024272c6 bdev/nvme: take nvme_ctrlr.mutex when setting keys 00:01:42.148 [Pipeline] withCredentials 00:01:42.162 > git --version # timeout=10 00:01:42.176 > git --version # 'git version 2.39.2' 00:01:42.194 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:01:42.197 [Pipeline] { 00:01:42.206 [Pipeline] retry 00:01:42.208 [Pipeline] { 00:01:42.224 [Pipeline] sh 00:01:42.508 + git ls-remote http://dpdk.org/git/dpdk-stable v22.11.4 00:01:42.782 [Pipeline] } 00:01:42.798 [Pipeline] // retry 00:01:42.803 [Pipeline] } 00:01:42.819 [Pipeline] // withCredentials 00:01:42.828 [Pipeline] httpRequest 00:01:43.407 [Pipeline] echo 00:01:43.408 Sorcerer 10.211.164.20 is alive 00:01:43.418 [Pipeline] retry 00:01:43.420 [Pipeline] { 00:01:43.434 [Pipeline] httpRequest 00:01:43.439 HttpMethod: GET 00:01:43.440 URL: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:43.440 Sending request to url: http://10.211.164.20/packages/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:43.447 Response Code: HTTP/1.1 200 OK 00:01:43.448 Success: Status code 200 is in the accepted range: 200,404 00:01:43.448 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:51.598 
[Pipeline] } 00:01:51.616 [Pipeline] // retry 00:01:51.623 [Pipeline] sh 00:01:51.909 + tar --no-same-owner -xf dpdk_fee0f13c213d0584f0c42a51d0e0625d99a0b2f1.tar.gz 00:01:53.301 [Pipeline] sh 00:01:53.599 + git -C dpdk log --oneline -n5 00:01:53.599 caf0f5d395 version: 22.11.4 00:01:53.599 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:53.599 dc9c799c7d vhost: fix missing spinlock unlock 00:01:53.599 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:53.599 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:53.609 [Pipeline] } 00:01:53.624 [Pipeline] // stage 00:01:53.635 [Pipeline] stage 00:01:53.637 [Pipeline] { (Prepare) 00:01:53.664 [Pipeline] writeFile 00:01:53.684 [Pipeline] sh 00:01:53.973 + logger -p user.info -t JENKINS-CI 00:01:53.986 [Pipeline] sh 00:01:54.271 + logger -p user.info -t JENKINS-CI 00:01:54.283 [Pipeline] sh 00:01:54.567 + cat autorun-spdk.conf 00:01:54.567 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:54.567 SPDK_TEST_FUZZER_SHORT=1 00:01:54.567 SPDK_TEST_FUZZER=1 00:01:54.567 SPDK_TEST_SETUP=1 00:01:54.567 SPDK_RUN_UBSAN=1 00:01:54.567 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:54.567 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:54.573 RUN_NIGHTLY=1 00:01:54.577 [Pipeline] readFile 00:01:54.599 [Pipeline] withEnv 00:01:54.601 [Pipeline] { 00:01:54.612 [Pipeline] sh 00:01:54.898 + set -ex 00:01:54.898 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:54.898 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:54.898 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:54.898 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:54.898 ++ SPDK_TEST_FUZZER=1 00:01:54.898 ++ SPDK_TEST_SETUP=1 00:01:54.898 ++ SPDK_RUN_UBSAN=1 00:01:54.898 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:54.898 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:54.898 ++ RUN_NIGHTLY=1 00:01:54.898 + case $SPDK_TEST_NVMF_NICS in 00:01:54.898 + DRIVERS= 00:01:54.898 + [[ -n '' ]] 00:01:54.898 + exit 0 00:01:54.908 [Pipeline] } 00:01:54.921 [Pipeline] // withEnv 00:01:54.927 [Pipeline] } 00:01:54.941 [Pipeline] // stage 00:01:54.951 [Pipeline] catchError 00:01:54.953 [Pipeline] { 00:01:54.968 [Pipeline] timeout 00:01:54.969 Timeout set to expire in 30 min 00:01:54.971 [Pipeline] { 00:01:54.985 [Pipeline] stage 00:01:54.988 [Pipeline] { (Tests) 00:01:55.002 [Pipeline] sh 00:01:55.390 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:55.390 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:55.390 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:55.390 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:55.390 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:55.390 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:55.390 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:55.390 + [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:55.390 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:55.390 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:55.390 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:55.390 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:55.390 + source /etc/os-release
00:01:55.390 ++ NAME='Fedora Linux'
00:01:55.390 ++ VERSION='39 (Cloud Edition)'
00:01:55.390 ++ ID=fedora
00:01:55.390 ++ VERSION_ID=39
00:01:55.390 ++ VERSION_CODENAME=
00:01:55.390 ++ PLATFORM_ID=platform:f39
00:01:55.390 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:55.390 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:55.390 ++ LOGO=fedora-logo-icon
00:01:55.390 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:55.390 ++ HOME_URL=https://fedoraproject.org/
00:01:55.390 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:55.390 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:55.390 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:55.390 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:55.390 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:55.390 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:55.390 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:55.390 ++ SUPPORT_END=2024-11-12
00:01:55.390 ++ VARIANT='Cloud Edition'
00:01:55.390 ++ VARIANT_ID=cloud
00:01:55.390 + uname -a
00:01:55.390 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:55.390 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:58.683 Hugepages
00:01:58.683 node hugesize free / total
00:01:58.683 node0 1048576kB 0 / 0
00:01:58.683 node0 2048kB 0 / 0
00:01:58.683 node1 1048576kB 0 / 0
00:01:58.683 node1 2048kB 0 / 0
00:01:58.683
00:01:58.683 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:58.683 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:58.683 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:58.683 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:58.683 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:58.683 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:58.683 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:58.683 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:58.683 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:58.683 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:58.683 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:58.683 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:58.683 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:58.683 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:58.683 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:58.683 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:58.683 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:58.683 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1
00:01:58.683 + rm -f /tmp/spdk-ld-path
00:01:58.683 + source autorun-spdk.conf
00:01:58.683 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:58.683 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:58.683 ++ SPDK_TEST_FUZZER=1
00:01:58.683 ++ SPDK_TEST_SETUP=1
00:01:58.683 ++ SPDK_RUN_UBSAN=1
00:01:58.683 ++ SPDK_TEST_NATIVE_DPDK=v22.11.4
00:01:58.683 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:58.683 ++ RUN_NIGHTLY=1
00:01:58.683 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:58.683 + [[ -n '' ]]
00:01:58.683 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:58.683 + for M in
/var/spdk/build-*-manifest.txt 00:01:58.683 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:01:58.683 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:58.683 + for M in /var/spdk/build-*-manifest.txt 00:01:58.683 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:58.683 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:58.683 + for M in /var/spdk/build-*-manifest.txt 00:01:58.683 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:58.683 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:58.683 ++ uname 00:01:58.683 + [[ Linux == \L\i\n\u\x ]] 00:01:58.683 + sudo dmesg -T 00:01:58.683 + sudo dmesg --clear 00:01:58.683 + dmesg_pid=3613806 00:01:58.683 + [[ Fedora Linux == FreeBSD ]] 00:01:58.683 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:58.683 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:58.683 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:58.683 + [[ -x /usr/src/fio-static/fio ]] 00:01:58.683 + export FIO_BIN=/usr/src/fio-static/fio 00:01:58.683 + FIO_BIN=/usr/src/fio-static/fio 00:01:58.683 + sudo dmesg -Tw 00:01:58.683 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:58.683 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:58.683 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:58.684 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:58.684 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:58.684 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:58.684 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:58.684 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:58.684 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:58.684 Test configuration: 00:01:58.684 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:58.684 SPDK_TEST_FUZZER_SHORT=1 00:01:58.684 SPDK_TEST_FUZZER=1 00:01:58.684 SPDK_TEST_SETUP=1 00:01:58.684 SPDK_RUN_UBSAN=1 00:01:58.684 SPDK_TEST_NATIVE_DPDK=v22.11.4 00:01:58.684 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:58.684 RUN_NIGHTLY=1 16:25:56 -- common/autotest_common.sh@1680 -- $ [[ n == y ]] 00:01:58.684 16:25:56 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:58.684 16:25:56 -- scripts/common.sh@15 -- $ shopt -s extglob 00:01:58.684 16:25:56 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:58.684 16:25:56 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:58.684 16:25:56 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:58.684 16:25:56 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:58.684 16:25:56 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:58.684 16:25:56 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:58.684 16:25:56 -- paths/export.sh@5 -- $ export PATH 00:01:58.684 16:25:56 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:58.684 16:25:56 -- common/autobuild_common.sh@478 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:58.684 16:25:56 -- common/autobuild_common.sh@479 -- $ date +%s 00:01:58.684 16:25:56 -- common/autobuild_common.sh@479 -- $ mktemp -dt spdk_1732807556.XXXXXX 00:01:58.684 16:25:56 -- common/autobuild_common.sh@479 -- $ SPDK_WORKSPACE=/tmp/spdk_1732807556.gH04jx 00:01:58.684 16:25:56 -- common/autobuild_common.sh@481 -- $ [[ -n '' ]] 00:01:58.684 16:25:56 -- common/autobuild_common.sh@485 -- $ '[' -n v22.11.4 ']' 00:01:58.684 16:25:56 -- common/autobuild_common.sh@486 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:58.684 16:25:56 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:58.684 16:25:56 -- common/autobuild_common.sh@492 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:58.684 16:25:56 -- common/autobuild_common.sh@494 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:58.684 16:25:56 -- common/autobuild_common.sh@495 -- $ get_config_params 00:01:58.684 16:25:56 -- common/autotest_common.sh@407 -- $ xtrace_disable 00:01:58.684 16:25:56 -- common/autotest_common.sh@10 -- $ set +x 00:01:58.684 16:25:56 -- common/autobuild_common.sh@495 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:58.684 16:25:56 -- common/autobuild_common.sh@497 -- $ start_monitor_resources 00:01:58.684 16:25:56 -- pm/common@17 -- $ local monitor 00:01:58.684 16:25:56 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:58.684 16:25:56 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:58.684 16:25:56 -- pm/common@19 -- $ for monitor in 
"${MONITOR_RESOURCES[@]}" 00:01:58.684 16:25:56 -- pm/common@21 -- $ date +%s 00:01:58.684 16:25:56 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:58.684 16:25:56 -- pm/common@21 -- $ date +%s 00:01:58.684 16:25:56 -- pm/common@21 -- $ date +%s 00:01:58.684 16:25:56 -- pm/common@25 -- $ sleep 1 00:01:58.684 16:25:56 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732807556 00:01:58.684 16:25:56 -- pm/common@21 -- $ date +%s 00:01:58.684 16:25:56 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732807556 00:01:58.684 16:25:56 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732807556 00:01:58.684 16:25:56 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732807556 00:01:58.684 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732807556_collect-cpu-temp.pm.log 00:01:58.943 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732807556_collect-cpu-load.pm.log 00:01:58.943 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732807556_collect-vmstat.pm.log 00:01:58.943 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732807556_collect-bmc-pm.bmc.pm.log 00:01:59.885 16:25:57 -- common/autobuild_common.sh@498 -- $ trap stop_monitor_resources EXIT 00:01:59.885 16:25:57 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:59.885 16:25:57 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:59.885 16:25:57 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:59.885 16:25:57 -- spdk/autobuild.sh@16 -- $ date -u 00:01:59.885 Thu Nov 28 03:25:57 PM UTC 2024 00:01:59.885 16:25:57 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:59.885 v24.09-1-gb18e1bd62 00:01:59.885 16:25:57 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:59.885 16:25:57 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:59.885 16:25:57 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:59.885 16:25:57 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:01:59.885 16:25:57 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:01:59.885 16:25:57 -- common/autotest_common.sh@10 -- $ set +x 00:01:59.885 ************************************ 00:01:59.885 START TEST ubsan 00:01:59.885 ************************************ 00:01:59.885 16:25:57 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan' 00:01:59.885 using ubsan 00:01:59.885 00:01:59.885 real 0m0.001s 00:01:59.885 user 0m0.000s 00:01:59.885 sys 0m0.000s 00:01:59.885 16:25:57 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:01:59.885 16:25:57 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:59.885 ************************************ 00:01:59.885 END TEST ubsan 00:01:59.885 ************************************ 00:01:59.885 16:25:57 -- 
spdk/autobuild.sh@27 -- $ '[' -n v22.11.4 ']' 00:01:59.885 16:25:57 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:59.885 16:25:57 -- common/autobuild_common.sh@442 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:59.885 16:25:57 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:01:59.885 16:25:57 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:01:59.885 16:25:57 -- common/autotest_common.sh@10 -- $ set +x 00:01:59.885 ************************************ 00:01:59.885 START TEST build_native_dpdk 00:01:59.885 ************************************ 00:01:59.885 16:25:57 build_native_dpdk -- common/autotest_common.sh@1125 -- $ _build_native_dpdk 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:59.885 caf0f5d395 version: 22.11.4 00:01:59.885 7d6f1cc05f Revert "net/iavf: fix abnormal disable HW interrupt" 00:01:59.885 dc9c799c7d vhost: fix missing spinlock unlock 00:01:59.885 4307659a90 net/mlx5: fix LACP redirection in Rx domain 00:01:59.885 6ef77f2a5e net/gve: fix RX buffer size alignment 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=22.11.4 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@100 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base") 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@102 -- $ local mlx5_libs_added=n 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@103 -- $ [[ 0 -eq 1 ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@139 -- $ [[ 0 -eq 1 ]] 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@167 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@168 -- $ uname -s 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@168 -- $ '[' Linux = Linux ']' 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@169 -- $ lt 22.11.4 21.11.0 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 21.11.0 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:59.885 
16:25:57 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@173 -- $ patch -p1 00:01:59.885 patching file config/rte_config.h 00:01:59.885 Hunk #1 succeeded at 60 (offset 1 line). 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 22.11.4 24.07.0 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 22.11.4 '<' 24.07.0 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:01:59.885 16:25:57 build_native_dpdk -- scripts/common.sh@368 -- $ return 0 00:01:59.885 16:25:57 build_native_dpdk -- common/autobuild_common.sh@177 -- $ patch -p1 00:01:59.886 patching file lib/pcapng/rte_pcapng.c 00:01:59.886 Hunk #1 succeeded at 110 (offset -18 lines). 00:01:59.886 16:25:57 build_native_dpdk -- common/autobuild_common.sh@179 -- $ ge 22.11.4 24.07.0 00:01:59.886 16:25:57 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 22.11.4 '>=' 24.07.0 00:01:59.886 16:25:57 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:59.886 16:25:57 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:59.886 16:25:57 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:00.145 16:25:57 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:00.145 16:25:57 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:00.145 16:25:57 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:00.145 16:25:57 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:02:00.145 16:25:57 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 22 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@353 -- $ local d=22 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 22 =~ ^[0-9]+$ ]] 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@355 -- $ echo 22 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=22 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:00.146 16:25:57 build_native_dpdk -- scripts/common.sh@368 -- $ return 1 00:02:00.146 16:25:57 build_native_dpdk -- common/autobuild_common.sh@183 -- $ dpdk_kmods=false 00:02:00.146 16:25:57 build_native_dpdk -- common/autobuild_common.sh@184 -- $ uname -s 00:02:00.146 16:25:57 build_native_dpdk -- common/autobuild_common.sh@184 -- $ '[' Linux = FreeBSD ']' 00:02:00.146 16:25:57 build_native_dpdk -- common/autobuild_common.sh@188 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base 00:02:00.146 16:25:57 build_native_dpdk -- common/autobuild_common.sh@188 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base, 00:02:05.431 The Meson build system 00:02:05.431 Version: 1.5.0 00:02:05.431 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:05.431 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:02:05.431 Build type: native build 00:02:05.431 Program cat found: YES (/usr/bin/cat) 00:02:05.431 Project name: DPDK 00:02:05.431 Project version: 22.11.4 00:02:05.431 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:02:05.431 C linker for the host machine: gcc ld.bfd 2.40-14 00:02:05.431 Host machine cpu family: x86_64 00:02:05.431 Host machine cpu: x86_64 00:02:05.431 Message: ## Building in Developer Mode ## 00:02:05.431 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:05.431 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:02:05.431 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:02:05.431 Program objdump found: YES (/usr/bin/objdump) 00:02:05.431 Program python3 found: YES (/usr/bin/python3) 00:02:05.431 Program cat found: YES (/usr/bin/cat) 00:02:05.431 config/meson.build:83: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
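The lt/ge traces above come from SPDK's scripts/common.sh, which decides whether each DPDK compatibility patch applies by comparing dotted version strings field by field. A hedged, self-contained sketch of the same idea (not the exact SPDK implementation; like the trace it splits on dots, dashes, and colons, and `10#` forces decimal so a field such as "07" is not read as octal):

    # Sketch: "is version $1 strictly older than $2?"
    lt() {
        local IFS='.-:'
        local -a v1=($1) v2=($2)   # split each version into numeric fields
        local i
        for ((i = 0; i < ${#v1[@]} || i < ${#v2[@]}; i++)); do
            (( 10#${v1[i]:-0} < 10#${v2[i]:-0} )) && return 0
            (( 10#${v1[i]:-0} > 10#${v2[i]:-0} )) && return 1
        done
        return 1   # equal versions are not "less than"
    }
    lt 22.11.4 24.07.0 && echo "22.11.4 predates 24.07.0: the rte_pcapng patch applies"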
00:02:05.431 Checking for size of "void *" : 8 00:02:05.431 Checking for size of "void *" : 8 (cached) 00:02:05.431 Library m found: YES 00:02:05.431 Library numa found: YES 00:02:05.431 Has header "numaif.h" : YES 00:02:05.431 Library fdt found: NO 00:02:05.431 Library execinfo found: NO 00:02:05.431 Has header "execinfo.h" : YES 00:02:05.431 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:05.431 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:05.431 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:05.431 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:05.431 Run-time dependency openssl found: YES 3.1.1 00:02:05.431 Run-time dependency libpcap found: YES 1.10.4 00:02:05.431 Has header "pcap.h" with dependency libpcap: YES 00:02:05.431 Compiler for C supports arguments -Wcast-qual: YES 00:02:05.431 Compiler for C supports arguments -Wdeprecated: YES 00:02:05.431 Compiler for C supports arguments -Wformat: YES 00:02:05.431 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:05.431 Compiler for C supports arguments -Wformat-security: NO 00:02:05.431 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:05.431 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:05.431 Compiler for C supports arguments -Wnested-externs: YES 00:02:05.431 Compiler for C supports arguments -Wold-style-definition: YES 00:02:05.431 Compiler for C supports arguments -Wpointer-arith: YES 00:02:05.431 Compiler for C supports arguments -Wsign-compare: YES 00:02:05.431 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:05.431 Compiler for C supports arguments -Wundef: YES 00:02:05.431 Compiler for C supports arguments -Wwrite-strings: YES 00:02:05.431 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:05.431 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:05.431 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:05.431 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:05.431 Compiler for C supports arguments -mavx512f: YES 00:02:05.431 Checking if "AVX512 checking" compiles: YES 00:02:05.431 Fetching value of define "__SSE4_2__" : 1 00:02:05.431 Fetching value of define "__AES__" : 1 00:02:05.431 Fetching value of define "__AVX__" : 1 00:02:05.431 Fetching value of define "__AVX2__" : 1 00:02:05.431 Fetching value of define "__AVX512BW__" : 1 00:02:05.432 Fetching value of define "__AVX512CD__" : 1 00:02:05.432 Fetching value of define "__AVX512DQ__" : 1 00:02:05.432 Fetching value of define "__AVX512F__" : 1 00:02:05.432 Fetching value of define "__AVX512VL__" : 1 00:02:05.432 Fetching value of define "__PCLMUL__" : 1 00:02:05.432 Fetching value of define "__RDRND__" : 1 00:02:05.432 Fetching value of define "__RDSEED__" : 1 00:02:05.432 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:05.432 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:05.432 Message: lib/kvargs: Defining dependency "kvargs" 00:02:05.432 Message: lib/telemetry: Defining dependency "telemetry" 00:02:05.432 Checking for function "getentropy" : YES 00:02:05.432 Message: lib/eal: Defining dependency "eal" 00:02:05.432 Message: lib/ring: Defining dependency "ring" 00:02:05.432 Message: lib/rcu: Defining dependency "rcu" 00:02:05.432 Message: lib/mempool: Defining dependency "mempool" 00:02:05.432 Message: lib/mbuf: Defining dependency "mbuf" 00:02:05.432 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:05.432 Fetching 
value of define "__AVX512F__" : 1 (cached) 00:02:05.432 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:05.432 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:05.432 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:05.432 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:05.432 Compiler for C supports arguments -mpclmul: YES 00:02:05.432 Compiler for C supports arguments -maes: YES 00:02:05.432 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:05.432 Compiler for C supports arguments -mavx512bw: YES 00:02:05.432 Compiler for C supports arguments -mavx512dq: YES 00:02:05.432 Compiler for C supports arguments -mavx512vl: YES 00:02:05.432 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:05.432 Compiler for C supports arguments -mavx2: YES 00:02:05.432 Compiler for C supports arguments -mavx: YES 00:02:05.432 Message: lib/net: Defining dependency "net" 00:02:05.432 Message: lib/meter: Defining dependency "meter" 00:02:05.432 Message: lib/ethdev: Defining dependency "ethdev" 00:02:05.432 Message: lib/pci: Defining dependency "pci" 00:02:05.432 Message: lib/cmdline: Defining dependency "cmdline" 00:02:05.432 Message: lib/metrics: Defining dependency "metrics" 00:02:05.432 Message: lib/hash: Defining dependency "hash" 00:02:05.432 Message: lib/timer: Defining dependency "timer" 00:02:05.432 Fetching value of define "__AVX2__" : 1 (cached) 00:02:05.432 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:05.432 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:05.432 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:05.432 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:05.432 Message: lib/acl: Defining dependency "acl" 00:02:05.432 Message: lib/bbdev: Defining dependency "bbdev" 00:02:05.432 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:05.432 Run-time dependency libelf found: YES 0.191 00:02:05.432 Message: lib/bpf: Defining dependency "bpf" 00:02:05.432 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:05.432 Message: lib/compressdev: Defining dependency "compressdev" 00:02:05.432 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:05.432 Message: lib/distributor: Defining dependency "distributor" 00:02:05.432 Message: lib/efd: Defining dependency "efd" 00:02:05.432 Message: lib/eventdev: Defining dependency "eventdev" 00:02:05.432 Message: lib/gpudev: Defining dependency "gpudev" 00:02:05.432 Message: lib/gro: Defining dependency "gro" 00:02:05.432 Message: lib/gso: Defining dependency "gso" 00:02:05.432 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:05.432 Message: lib/jobstats: Defining dependency "jobstats" 00:02:05.432 Message: lib/latencystats: Defining dependency "latencystats" 00:02:05.432 Message: lib/lpm: Defining dependency "lpm" 00:02:05.432 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:05.432 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:05.432 Fetching value of define "__AVX512IFMA__" : (undefined) 00:02:05.432 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:02:05.432 Message: lib/member: Defining dependency "member" 00:02:05.432 Message: lib/pcapng: Defining dependency "pcapng" 00:02:05.432 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:05.432 Message: lib/power: Defining dependency "power" 00:02:05.432 Message: lib/rawdev: Defining dependency "rawdev" 00:02:05.432 Message: lib/regexdev: Defining dependency "regexdev" 00:02:05.432 Message: lib/dmadev: 
Defining dependency "dmadev"
00:02:05.432 Message: lib/rib: Defining dependency "rib"
00:02:05.432 Message: lib/reorder: Defining dependency "reorder"
00:02:05.432 Message: lib/sched: Defining dependency "sched"
00:02:05.432 Message: lib/security: Defining dependency "security"
00:02:05.432 Message: lib/stack: Defining dependency "stack"
00:02:05.432 Has header "linux/userfaultfd.h" : YES
00:02:05.432 Message: lib/vhost: Defining dependency "vhost"
00:02:05.432 Message: lib/ipsec: Defining dependency "ipsec"
00:02:05.432 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:05.432 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:05.432 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:05.432 Message: lib/fib: Defining dependency "fib"
00:02:05.432 Message: lib/port: Defining dependency "port"
00:02:05.432 Message: lib/pdump: Defining dependency "pdump"
00:02:05.432 Message: lib/table: Defining dependency "table"
00:02:05.432 Message: lib/pipeline: Defining dependency "pipeline"
00:02:05.432 Message: lib/graph: Defining dependency "graph"
00:02:05.432 Message: lib/node: Defining dependency "node"
00:02:05.432 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:05.432 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:05.432 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:05.432 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:02:05.432 Compiler for C supports arguments -Wno-sign-compare: YES
00:02:05.432 Compiler for C supports arguments -Wno-unused-value: YES
00:02:05.432 Compiler for C supports arguments -Wno-format: YES
00:02:05.432 Compiler for C supports arguments -Wno-format-security: YES
00:02:05.432 Compiler for C supports arguments -Wno-format-nonliteral: YES
00:02:06.010 Compiler for C supports arguments -Wno-strict-aliasing: YES
00:02:06.010 Compiler for C supports arguments -Wno-unused-but-set-variable: YES
00:02:06.010 Compiler for C supports arguments -Wno-unused-parameter: YES
00:02:06.010 Fetching value of define "__AVX2__" : 1 (cached)
00:02:06.010 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:06.010 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:06.010 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:06.010 Compiler for C supports arguments -mavx512bw: YES (cached)
00:02:06.010 Compiler for C supports arguments -march=skylake-avx512: YES
00:02:06.010 Message: drivers/net/i40e: Defining dependency "net_i40e"
00:02:06.010 Program doxygen found: YES (/usr/local/bin/doxygen)
00:02:06.010 Configuring doxy-api.conf using configuration
00:02:06.010 Program sphinx-build found: NO
00:02:06.010 Configuring rte_build_config.h using configuration
00:02:06.010 Message:
00:02:06.010 =================
00:02:06.010 Applications Enabled
00:02:06.010 =================
00:02:06.010
00:02:06.010 apps:
00:02:06.010 dumpcap, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, test-crypto-perf,
00:02:06.010 test-eventdev, test-fib, test-flow-perf, test-gpudev, test-pipeline, test-pmd, test-regex, test-sad,
00:02:06.010 test-security-perf,
00:02:06.010
00:02:06.010 Message:
00:02:06.010 =================
00:02:06.010 Libraries Enabled
00:02:06.010 =================
00:02:06.010
00:02:06.010 libs:
00:02:06.010 kvargs, telemetry, eal, ring, rcu, mempool, mbuf, net,
00:02:06.010 meter, ethdev, pci, cmdline, metrics, hash, timer, acl,
00:02:06.010 bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor, efd,
00:02:06.010 eventdev, gpudev, gro, gso, ip_frag, jobstats, latencystats, lpm,
00:02:06.010 member, pcapng, power, rawdev, regexdev, dmadev, rib, reorder,
00:02:06.010 sched, security, stack, vhost, ipsec, fib, port, pdump,
00:02:06.010 table, pipeline, graph, node,
00:02:06.010
00:02:06.010 Message:
00:02:06.010 ===============
00:02:06.010 Drivers Enabled
00:02:06.010 ===============
00:02:06.010
00:02:06.010 common:
00:02:06.010
00:02:06.010 bus:
00:02:06.010 pci, vdev,
00:02:06.010 mempool:
00:02:06.010 ring,
00:02:06.010 dma:
00:02:06.010
00:02:06.010 net:
00:02:06.010 i40e,
00:02:06.010 raw:
00:02:06.010
00:02:06.010 crypto:
00:02:06.010
00:02:06.010 compress:
00:02:06.010
00:02:06.010 regex:
00:02:06.010
00:02:06.010 vdpa:
00:02:06.010
00:02:06.010 event:
00:02:06.010
00:02:06.010 baseband:
00:02:06.010
00:02:06.010 gpu:
00:02:06.010
00:02:06.010
00:02:06.010 Message:
00:02:06.010 =================
00:02:06.010 Content Skipped
00:02:06.010 =================
00:02:06.010
00:02:06.010 apps:
00:02:06.010
00:02:06.010 libs:
00:02:06.010 kni: explicitly disabled via build config (deprecated lib)
00:02:06.010 flow_classify: explicitly disabled via build config (deprecated lib)
00:02:06.010
00:02:06.010 drivers:
00:02:06.010 common/cpt: not in enabled drivers build config
00:02:06.010 common/dpaax: not in enabled drivers build config
00:02:06.010 common/iavf: not in enabled drivers build config
00:02:06.010 common/idpf: not in enabled drivers build config
00:02:06.010 common/mvep: not in enabled drivers build config
00:02:06.010 common/octeontx: not in enabled drivers build config
00:02:06.010 bus/auxiliary: not in enabled drivers build config
00:02:06.010 bus/dpaa: not in enabled drivers build config
00:02:06.010 bus/fslmc: not in enabled drivers build config
00:02:06.010 bus/ifpga: not in enabled drivers build config
00:02:06.010 bus/vmbus: not in enabled drivers build config
00:02:06.010 common/cnxk: not in enabled drivers build config
00:02:06.010 common/mlx5: not in enabled drivers build config
00:02:06.010 common/qat: not in enabled drivers build config
00:02:06.010 common/sfc_efx: not in enabled drivers build config
00:02:06.010 mempool/bucket: not in enabled drivers build config
00:02:06.010 mempool/cnxk: not in enabled drivers build config
00:02:06.010 mempool/dpaa: not in enabled drivers build config
00:02:06.010 mempool/dpaa2: not in enabled drivers build config
00:02:06.010 mempool/octeontx: not in enabled drivers build config
00:02:06.010 mempool/stack: not in enabled drivers build config
00:02:06.010 dma/cnxk: not in enabled drivers build config
00:02:06.010 dma/dpaa: not in enabled drivers build config
00:02:06.010 dma/dpaa2: not in enabled drivers build config
00:02:06.010 dma/hisilicon: not in enabled drivers build config
00:02:06.010 dma/idxd: not in enabled drivers build config
00:02:06.010 dma/ioat: not in enabled drivers build config
00:02:06.010 dma/skeleton: not in enabled drivers build config
00:02:06.010 net/af_packet: not in enabled drivers build config
00:02:06.010 net/af_xdp: not in enabled drivers build config
00:02:06.010 net/ark: not in enabled drivers build config
00:02:06.010 net/atlantic: not in enabled drivers build config
00:02:06.010 net/avp: not in enabled drivers build config
00:02:06.010 net/axgbe: not in enabled drivers build config
00:02:06.010 net/bnx2x: not in enabled drivers build config
00:02:06.010 net/bnxt: not in enabled drivers build config
00:02:06.010 net/bonding: not in enabled drivers build config
00:02:06.010 net/cnxk: not in enabled drivers build config
00:02:06.010 net/cxgbe: not in enabled drivers build config
00:02:06.010 net/dpaa: not in enabled drivers build config
00:02:06.010 net/dpaa2: not in enabled drivers build config
00:02:06.010 net/e1000: not in enabled drivers build config
00:02:06.010 net/ena: not in enabled drivers build config
00:02:06.010 net/enetc: not in enabled drivers build config
00:02:06.010 net/enetfec: not in enabled drivers build config
00:02:06.010 net/enic: not in enabled drivers build config
00:02:06.010 net/failsafe: not in enabled drivers build config
00:02:06.010 net/fm10k: not in enabled drivers build config
00:02:06.010 net/gve: not in enabled drivers build config
00:02:06.010 net/hinic: not in enabled drivers build config
00:02:06.010 net/hns3: not in enabled drivers build config
00:02:06.010 net/iavf: not in enabled drivers build config
00:02:06.010 net/ice: not in enabled drivers build config
00:02:06.010 net/idpf: not in enabled drivers build config
00:02:06.010 net/igc: not in enabled drivers build config
00:02:06.010 net/ionic: not in enabled drivers build config
00:02:06.010 net/ipn3ke: not in enabled drivers build config
00:02:06.010 net/ixgbe: not in enabled drivers build config
00:02:06.010 net/kni: not in enabled drivers build config
00:02:06.010 net/liquidio: not in enabled drivers build config
00:02:06.010 net/mana: not in enabled drivers build config
00:02:06.010 net/memif: not in enabled drivers build config
00:02:06.010 net/mlx4: not in enabled drivers build config
00:02:06.010 net/mlx5: not in enabled drivers build config
00:02:06.010 net/mvneta: not in enabled drivers build config
00:02:06.010 net/mvpp2: not in enabled drivers build config
00:02:06.010 net/netvsc: not in enabled drivers build config
00:02:06.010 net/nfb: not in enabled drivers build config
00:02:06.010 net/nfp: not in enabled drivers build config
00:02:06.010 net/ngbe: not in enabled drivers build config
00:02:06.010 net/null: not in enabled drivers build config
00:02:06.010 net/octeontx: not in enabled drivers build config
00:02:06.010 net/octeon_ep: not in enabled drivers build config
00:02:06.010 net/pcap: not in enabled drivers build config
00:02:06.010 net/pfe: not in enabled drivers build config
00:02:06.010 net/qede: not in enabled drivers build config
00:02:06.010 net/ring: not in enabled drivers build config
00:02:06.010 net/sfc: not in enabled drivers build config
00:02:06.010 net/softnic: not in enabled drivers build config
00:02:06.010 net/tap: not in enabled drivers build config
00:02:06.010 net/thunderx: not in enabled drivers build config
00:02:06.010 net/txgbe: not in enabled drivers build config
00:02:06.010 net/vdev_netvsc: not in enabled drivers build config
00:02:06.010 net/vhost: not in enabled drivers build config
00:02:06.010 net/virtio: not in enabled drivers build config
00:02:06.010 net/vmxnet3: not in enabled drivers build config
00:02:06.010 raw/cnxk_bphy: not in enabled drivers build config
00:02:06.010 raw/cnxk_gpio: not in enabled drivers build config
00:02:06.010 raw/dpaa2_cmdif: not in enabled drivers build config
00:02:06.010 raw/ifpga: not in enabled drivers build config
00:02:06.010 raw/ntb: not in enabled drivers build config
00:02:06.010 raw/skeleton: not in enabled drivers build config
00:02:06.010 crypto/armv8: not in enabled drivers build config
00:02:06.010 crypto/bcmfs: not in enabled drivers build config
00:02:06.010 crypto/caam_jr: not in enabled drivers build config
00:02:06.010 crypto/ccp: not in enabled drivers build config
00:02:06.010 crypto/cnxk: not in enabled drivers build config
00:02:06.010 crypto/dpaa_sec: not in enabled drivers build config
00:02:06.010 crypto/dpaa2_sec: not in enabled drivers build config
00:02:06.010 crypto/ipsec_mb: not in enabled drivers build config
00:02:06.010 crypto/mlx5: not in enabled drivers build config
00:02:06.010 crypto/mvsam: not in enabled drivers build config
00:02:06.010 crypto/nitrox: not in enabled drivers build config
00:02:06.010 crypto/null: not in enabled drivers build config
00:02:06.010 crypto/octeontx: not in enabled drivers build config
00:02:06.010 crypto/openssl: not in enabled drivers build config
00:02:06.010 crypto/scheduler: not in enabled drivers build config
00:02:06.010 crypto/uadk: not in enabled drivers build config
00:02:06.010 crypto/virtio: not in enabled drivers build config
00:02:06.010 compress/isal: not in enabled drivers build config
00:02:06.010 compress/mlx5: not in enabled drivers build config
00:02:06.010 compress/octeontx: not in enabled drivers build config
00:02:06.010 compress/zlib: not in enabled drivers build config
00:02:06.010 regex/mlx5: not in enabled drivers build config
00:02:06.010 regex/cn9k: not in enabled drivers build config
00:02:06.010 vdpa/ifc: not in enabled drivers build config
00:02:06.010 vdpa/mlx5: not in enabled drivers build config
00:02:06.010 vdpa/sfc: not in enabled drivers build config
00:02:06.011 event/cnxk: not in enabled drivers build config
00:02:06.011 event/dlb2: not in enabled drivers build config
00:02:06.011 event/dpaa: not in enabled drivers build config
00:02:06.011 event/dpaa2: not in enabled drivers build config
00:02:06.011 event/dsw: not in enabled drivers build config
00:02:06.011 event/opdl: not in enabled drivers build config
00:02:06.011 event/skeleton: not in enabled drivers build config
00:02:06.011 event/sw: not in enabled drivers build config
00:02:06.011 event/octeontx: not in enabled drivers build config
00:02:06.011 baseband/acc: not in enabled drivers build config
00:02:06.011 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:02:06.011 baseband/fpga_lte_fec: not in enabled drivers build config
00:02:06.011 baseband/la12xx: not in enabled drivers build config
00:02:06.011 baseband/null: not in enabled drivers build config
00:02:06.011 baseband/turbo_sw: not in enabled drivers build config
00:02:06.011 gpu/cuda: not in enabled drivers build config
00:02:06.011
00:02:06.011
00:02:06.011 Build targets in project: 311
00:02:06.011
00:02:06.011 DPDK 22.11.4
00:02:06.011
00:02:06.011 User defined options
00:02:06.011 libdir : lib
00:02:06.011 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:02:06.011 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:02:06.011 c_link_args :
00:02:06.011 enable_docs : false
00:02:06.011 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
00:02:06.011 enable_kmods : false
00:02:06.011 machine : native
00:02:06.011 tests : false
00:02:06.011
00:02:06.011 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:06.011 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
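The deprecation warning just above is triggered because autobuild invokes `meson build-tmp ...` without the explicit `setup` subcommand. The same configuration, written as a standalone command pair in the modern spelling (options copied from the invocation and the "User defined options" summary logged above; run from the dpdk checkout):

    # Configure and build DPDK the way this job does, via `meson setup`.
    cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk
    meson setup build-tmp \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
        --libdir lib \
        -Denable_docs=false -Denable_kmods=false -Dtests=false \
        -Dc_link_args= \
        '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Dmachine=native \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,
    ninja -C build-tmp -j112   # same parallelism as the logged ninja invocation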
00:02:06.011 16:26:03 build_native_dpdk -- common/autobuild_common.sh@192 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 00:02:06.011 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:06.011 [1/740] Generating lib/rte_kvargs_mingw with a custom command 00:02:06.011 [2/740] Generating lib/rte_telemetry_def with a custom command 00:02:06.011 [3/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:06.011 [4/740] Generating lib/rte_kvargs_def with a custom command 00:02:06.011 [5/740] Generating lib/rte_telemetry_mingw with a custom command 00:02:06.011 [6/740] Generating lib/rte_mempool_def with a custom command 00:02:06.011 [7/740] Generating lib/rte_eal_def with a custom command 00:02:06.011 [8/740] Generating lib/rte_ring_mingw with a custom command 00:02:06.011 [9/740] Generating lib/rte_mempool_mingw with a custom command 00:02:06.011 [10/740] Generating lib/rte_ring_def with a custom command 00:02:06.011 [11/740] Generating lib/rte_mbuf_def with a custom command 00:02:06.011 [12/740] Generating lib/rte_mbuf_mingw with a custom command 00:02:06.011 [13/740] Generating lib/rte_eal_mingw with a custom command 00:02:06.011 [14/740] Generating lib/rte_net_def with a custom command 00:02:06.011 [15/740] Generating lib/rte_rcu_def with a custom command 00:02:06.011 [16/740] Generating lib/rte_rcu_mingw with a custom command 00:02:06.011 [17/740] Generating lib/rte_meter_def with a custom command 00:02:06.280 [18/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:06.280 [19/740] Generating lib/rte_net_mingw with a custom command 00:02:06.280 [20/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:06.280 [21/740] Generating lib/rte_meter_mingw with a custom command 00:02:06.280 [22/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:06.280 [23/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:06.280 [24/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:06.280 [25/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:06.280 [26/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:06.280 [27/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:06.280 [28/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_log.c.o 00:02:06.280 [29/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:06.280 [30/740] Generating lib/rte_ethdev_mingw with a custom command 00:02:06.280 [31/740] Generating lib/rte_ethdev_def with a custom command 00:02:06.280 [32/740] Generating lib/rte_pci_def with a custom command 00:02:06.280 [33/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:06.280 [34/740] Generating lib/rte_pci_mingw with a custom command 00:02:06.280 [35/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:06.280 [36/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:06.280 [37/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:06.280 [38/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:06.280 [39/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:06.280 [40/740] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:06.280 [41/740] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:06.280 [42/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:06.280 [43/740] Linking static target lib/librte_kvargs.a 00:02:06.280 [44/740] Generating lib/rte_cmdline_def with a custom command 00:02:06.280 [45/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:06.280 [46/740] Generating lib/rte_metrics_def with a custom command 00:02:06.280 [47/740] Generating lib/rte_cmdline_mingw with a custom command 00:02:06.280 [48/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:06.280 [49/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:06.280 [50/740] Generating lib/rte_metrics_mingw with a custom command 00:02:06.280 [51/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:06.280 [52/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:06.281 [53/740] Generating lib/rte_hash_def with a custom command 00:02:06.281 [54/740] Generating lib/rte_hash_mingw with a custom command 00:02:06.281 [55/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:06.281 [56/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:06.281 [57/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:06.281 [58/740] Generating lib/rte_timer_def with a custom command 00:02:06.281 [59/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:06.281 [60/740] Generating lib/rte_timer_mingw with a custom command 00:02:06.281 [61/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:06.281 [62/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:06.281 [63/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:06.281 [64/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:06.281 [65/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:06.281 [66/740] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:06.281 [67/740] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:06.281 [68/740] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:06.281 [69/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:06.281 [70/740] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:06.281 [71/740] Generating lib/rte_acl_mingw with a custom command 00:02:06.281 [72/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:06.281 [73/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:06.281 [74/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:06.281 [75/740] Generating lib/rte_acl_def with a custom command 00:02:06.281 [76/740] Generating lib/rte_bbdev_mingw with a custom command 00:02:06.281 [77/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:06.281 [78/740] Generating lib/rte_bbdev_def with a custom command 00:02:06.281 [79/740] Generating lib/rte_bitratestats_def with a custom command 00:02:06.281 [80/740] Generating lib/rte_bitratestats_mingw with a custom command 00:02:06.281 [81/740] Linking static target lib/librte_pci.a 00:02:06.281 [82/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:06.281 [83/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 
00:02:06.281 [84/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:06.281 [85/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:06.281 [86/740] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:06.281 [87/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:06.281 [88/740] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:06.281 [89/740] Generating lib/rte_bpf_def with a custom command 00:02:06.281 [90/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:06.281 [91/740] Generating lib/rte_cfgfile_def with a custom command 00:02:06.281 [92/740] Generating lib/rte_bpf_mingw with a custom command 00:02:06.540 [93/740] Generating lib/rte_cfgfile_mingw with a custom command 00:02:06.540 [94/740] Generating lib/rte_compressdev_def with a custom command 00:02:06.540 [95/740] Generating lib/rte_compressdev_mingw with a custom command 00:02:06.540 [96/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:06.540 [97/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:06.540 [98/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:06.540 [99/740] Linking static target lib/librte_meter.a 00:02:06.540 [100/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:06.540 [101/740] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:06.540 [102/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:06.540 [103/740] Linking static target lib/librte_ring.a 00:02:06.540 [104/740] Generating lib/rte_cryptodev_def with a custom command 00:02:06.540 [105/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:06.540 [106/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_log.c.o 00:02:06.540 [107/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:06.540 [108/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:06.540 [109/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:06.540 [110/740] Generating lib/rte_cryptodev_mingw with a custom command 00:02:06.540 [111/740] Generating lib/rte_distributor_mingw with a custom command 00:02:06.540 [112/740] Generating lib/rte_distributor_def with a custom command 00:02:06.540 [113/740] Generating lib/rte_efd_def with a custom command 00:02:06.540 [114/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:06.540 [115/740] Generating lib/rte_efd_mingw with a custom command 00:02:06.540 [116/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:06.540 [117/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:06.540 [118/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:06.540 [119/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:06.540 [120/740] Generating lib/rte_eventdev_def with a custom command 00:02:06.540 [121/740] Generating lib/rte_eventdev_mingw with a custom command 00:02:06.540 [122/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:06.540 [123/740] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:06.540 [124/740] Generating lib/rte_gpudev_mingw with a custom command 00:02:06.540 [125/740] Generating lib/rte_gpudev_def with a custom command 00:02:06.540 
[126/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:06.540 [127/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:02:06.540 [128/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:06.540 [129/740] Generating lib/rte_gro_def with a custom command 00:02:06.540 [130/740] Generating lib/rte_gro_mingw with a custom command 00:02:06.540 [131/740] Generating lib/rte_gso_def with a custom command 00:02:06.540 [132/740] Generating lib/rte_gso_mingw with a custom command 00:02:06.540 [133/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:06.800 [134/740] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.800 [135/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:06.800 [136/740] Generating lib/rte_ip_frag_def with a custom command 00:02:06.800 [137/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:06.800 [138/740] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.800 [139/740] Linking target lib/librte_kvargs.so.23.0 00:02:06.800 [140/740] Generating lib/rte_ip_frag_mingw with a custom command 00:02:06.800 [141/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:06.800 [142/740] Generating lib/rte_jobstats_mingw with a custom command 00:02:06.800 [143/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:06.800 [144/740] Generating lib/rte_jobstats_def with a custom command 00:02:06.800 [145/740] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:06.800 [146/740] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:06.800 [147/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:06.800 [148/740] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:06.800 [149/740] Linking static target lib/librte_cfgfile.a 00:02:06.800 [150/740] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.800 [151/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:06.800 [152/740] Generating lib/rte_latencystats_def with a custom command 00:02:06.800 [153/740] Generating lib/rte_latencystats_mingw with a custom command 00:02:06.800 [154/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:06.800 [155/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:06.800 [156/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:06.800 [157/740] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:06.800 [158/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:06.800 [159/740] Generating lib/rte_lpm_mingw with a custom command 00:02:06.800 [160/740] Generating lib/rte_lpm_def with a custom command 00:02:06.800 [161/740] Generating lib/rte_member_def with a custom command 00:02:06.800 [162/740] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:06.800 [163/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:06.800 [164/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:06.800 [165/740] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:06.800 [166/740] Generating lib/rte_pcapng_mingw with a custom command 00:02:06.800 [167/740] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:06.800 [168/740] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.800 [169/740] Generating lib/rte_member_mingw with a custom command 00:02:06.800 [170/740] Generating lib/rte_pcapng_def with a custom command 00:02:06.800 [171/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:06.800 [172/740] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:06.800 [173/740] Linking static target lib/librte_jobstats.a 00:02:06.800 [174/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:06.800 [175/740] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:06.800 [176/740] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:07.061 [177/740] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:07.061 [178/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:07.061 [179/740] Linking static target lib/librte_cmdline.a 00:02:07.061 [180/740] Generating symbol file lib/librte_kvargs.so.23.0.p/librte_kvargs.so.23.0.symbols 00:02:07.061 [181/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:07.061 [182/740] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:07.061 [183/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:07.061 [184/740] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:07.061 [185/740] Linking static target lib/librte_timer.a 00:02:07.061 [186/740] Generating lib/rte_power_def with a custom command 00:02:07.061 [187/740] Generating lib/rte_power_mingw with a custom command 00:02:07.061 [188/740] Generating lib/rte_rawdev_def with a custom command 00:02:07.061 [189/740] Linking static target lib/librte_metrics.a 00:02:07.061 [190/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:07.061 [191/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:07.061 [192/740] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:07.061 [193/740] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:07.061 [194/740] Generating lib/rte_rawdev_mingw with a custom command 00:02:07.061 [195/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:07.061 [196/740] Linking static target lib/librte_telemetry.a 00:02:07.061 [197/740] Generating lib/rte_regexdev_mingw with a custom command 00:02:07.061 [198/740] Generating lib/rte_regexdev_def with a custom command 00:02:07.061 [199/740] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:07.061 [200/740] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:07.061 [201/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:07.061 [202/740] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:07.061 [203/740] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:07.061 [204/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:07.061 [205/740] Generating lib/rte_dmadev_mingw with a custom command 00:02:07.061 [206/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:07.061 [207/740] Generating lib/rte_dmadev_def with a custom command 00:02:07.061 [208/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:07.061 [209/740] Generating lib/rte_rib_mingw with a custom command 00:02:07.061 [210/740] Compiling C object 
lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:07.061 [211/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:07.061 [212/740] Generating lib/rte_rib_def with a custom command 00:02:07.061 [213/740] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:07.061 [214/740] Generating lib/rte_reorder_mingw with a custom command 00:02:07.061 [215/740] Generating lib/rte_reorder_def with a custom command 00:02:07.061 [216/740] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:07.061 [217/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:07.061 [218/740] Linking static target lib/librte_net.a 00:02:07.061 [219/740] Generating lib/rte_sched_def with a custom command 00:02:07.061 [220/740] Linking static target lib/librte_bitratestats.a 00:02:07.061 [221/740] Generating lib/rte_sched_mingw with a custom command 00:02:07.061 [222/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:07.061 [223/740] Generating lib/rte_security_mingw with a custom command 00:02:07.061 [224/740] Generating lib/rte_security_def with a custom command 00:02:07.061 [225/740] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:07.061 [226/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:07.061 [227/740] Generating lib/rte_stack_def with a custom command 00:02:07.061 [228/740] Generating lib/rte_stack_mingw with a custom command 00:02:07.061 [229/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:07.061 [230/740] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:07.061 [231/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:07.061 [232/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:07.061 [233/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:07.061 [234/740] Generating lib/rte_vhost_def with a custom command 00:02:07.061 [235/740] Generating lib/rte_vhost_mingw with a custom command 00:02:07.061 [236/740] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:07.061 [237/740] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:07.061 [238/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:07.061 [239/740] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:07.061 [240/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:07.061 [241/740] Generating lib/rte_ipsec_def with a custom command 00:02:07.061 [242/740] Generating lib/rte_ipsec_mingw with a custom command 00:02:07.061 [243/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:07.061 [244/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:07.061 [245/740] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:07.061 [246/740] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:07.323 [247/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:07.323 [248/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:07.323 [249/740] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:07.323 [250/740] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:07.323 [251/740] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:07.323 [252/740] Generating lib/rte_fib_def with a custom command 00:02:07.323 [253/740] Generating lib/rte_fib_mingw with a 
custom command 00:02:07.323 [254/740] Linking static target lib/librte_stack.a 00:02:07.324 [255/740] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:07.324 [256/740] Compiling C object lib/librte_power.a.p/power_rte_power_empty_poll.c.o 00:02:07.324 [257/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:07.324 [258/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:07.324 [259/740] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:07.324 [260/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:07.324 [261/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:07.324 [262/740] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:07.324 [263/740] Generating lib/rte_port_mingw with a custom command 00:02:07.324 [264/740] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:07.324 [265/740] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:07.324 [266/740] Generating lib/rte_port_def with a custom command 00:02:07.324 [267/740] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:07.324 [268/740] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:07.324 [269/740] Generating lib/rte_pdump_def with a custom command 00:02:07.324 [270/740] Generating lib/rte_pdump_mingw with a custom command 00:02:07.324 [271/740] Linking static target lib/librte_compressdev.a 00:02:07.324 [272/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:07.324 [273/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:07.324 [274/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:07.324 [275/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:07.324 [276/740] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:07.324 [277/740] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.324 [278/740] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.324 [279/740] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:07.324 [280/740] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:07.324 [281/740] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:07.324 [282/740] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.324 [283/740] Linking static target lib/librte_rcu.a 00:02:07.324 [284/740] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:07.324 [285/740] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:07.324 [286/740] Linking static target lib/librte_mempool.a 00:02:07.324 [287/740] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:07.324 [288/740] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.586 [289/740] Linking static target lib/librte_rawdev.a 00:02:07.586 [290/740] Generating lib/rte_table_mingw with a custom command 00:02:07.586 [291/740] Generating lib/rte_table_def with a custom command 00:02:07.586 [292/740] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:07.586 [293/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:07.586 [294/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:07.586 [295/740] Linking static 
target lib/librte_bbdev.a 00:02:07.586 [296/740] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:07.586 [297/740] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:07.586 [298/740] Linking static target lib/librte_gpudev.a 00:02:07.586 [299/740] Linking static target lib/librte_gro.a 00:02:07.586 [300/740] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:07.586 [301/740] Linking static target lib/librte_dmadev.a 00:02:07.586 [302/740] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.586 [303/740] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:07.586 [304/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:07.586 [305/740] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:07.586 [306/740] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.586 [307/740] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:07.586 [308/740] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.586 [309/740] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.586 [310/740] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:07.586 [311/740] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:07.586 [312/740] Generating lib/rte_pipeline_mingw with a custom command 00:02:07.586 [313/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:07.586 [314/740] Generating lib/rte_pipeline_def with a custom command 00:02:07.586 [315/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:07.586 [316/740] Linking static target lib/librte_gso.a 00:02:07.586 [317/740] Linking static target lib/librte_latencystats.a 00:02:07.586 [318/740] Compiling C object lib/librte_power.a.p/power_rte_power_intel_uncore.c.o 00:02:07.586 [319/740] Linking target lib/librte_telemetry.so.23.0 00:02:07.586 [320/740] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:07.586 [321/740] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:07.587 [322/740] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:07.587 [323/740] Generating lib/rte_graph_def with a custom command 00:02:07.587 [324/740] Generating lib/rte_graph_mingw with a custom command 00:02:07.587 [325/740] Linking static target lib/librte_distributor.a 00:02:07.587 [326/740] Linking static target lib/member/libsketch_avx512_tmp.a 00:02:07.587 [327/740] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:07.848 [328/740] Linking static target lib/librte_ip_frag.a 00:02:07.848 [329/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:07.848 [330/740] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:07.848 [331/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:07.848 [332/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:07.848 [333/740] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:07.848 [334/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:07.848 [335/740] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:07.848 [336/740] Linking static target lib/librte_regexdev.a 00:02:07.848 [337/740] 
Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:07.848 [338/740] Compiling C object lib/librte_node.a.p/node_null.c.o 00:02:07.848 [339/740] Generating lib/rte_node_def with a custom command 00:02:07.848 [340/740] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:07.848 [341/740] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:07.848 [342/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:07.848 [343/740] Generating lib/rte_node_mingw with a custom command 00:02:07.848 [344/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:07.848 [345/740] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.848 [346/740] Generating symbol file lib/librte_telemetry.so.23.0.p/librte_telemetry.so.23.0.symbols 00:02:07.848 [347/740] Linking static target lib/librte_eal.a 00:02:07.848 [348/740] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:07.848 [349/740] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.848 [350/740] Generating drivers/rte_bus_pci_def with a custom command 00:02:07.848 [351/740] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:07.848 [352/740] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:07.848 [353/740] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:07.848 [354/740] Generating drivers/rte_bus_pci_mingw with a custom command 00:02:07.848 [355/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:07.848 [356/740] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.848 [357/740] Linking static target lib/librte_reorder.a 00:02:07.848 [358/740] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.848 [359/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:07.848 [360/740] Generating drivers/rte_bus_vdev_def with a custom command 00:02:07.848 [361/740] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:07.848 [362/740] Generating drivers/rte_bus_vdev_mingw with a custom command 00:02:07.848 [363/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:07.848 [364/740] Generating drivers/rte_mempool_ring_def with a custom command 00:02:07.848 [365/740] Generating drivers/rte_mempool_ring_mingw with a custom command 00:02:07.848 [366/740] Linking static target lib/librte_power.a 00:02:07.848 [367/740] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:07.848 [368/740] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:08.114 [369/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:08.114 [370/740] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:08.114 [371/740] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:08.114 [372/740] Linking static target lib/librte_pcapng.a 00:02:08.114 [373/740] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:08.114 [374/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:08.114 [375/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:08.114 [376/740] Linking static target lib/librte_security.a 00:02:08.114 [377/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:08.114 [378/740] Compiling C 
object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:08.114 [379/740] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:08.114 [380/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:08.114 [381/740] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.114 [382/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:08.114 [383/740] Linking static target lib/librte_mbuf.a 00:02:08.114 [384/740] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:08.114 [385/740] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:08.114 [386/740] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:08.114 [387/740] Linking static target lib/librte_bpf.a 00:02:08.114 [388/740] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:08.114 [389/740] Generating drivers/rte_net_i40e_mingw with a custom command 00:02:08.114 [390/740] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:08.114 [391/740] Generating drivers/rte_net_i40e_def with a custom command 00:02:08.114 [392/740] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.114 [393/740] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.114 [394/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:08.114 [395/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:08.114 [396/740] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:08.373 [397/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:08.373 [398/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:08.373 [399/740] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:08.373 [400/740] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:08.373 [401/740] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:08.373 [402/740] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:08.373 [403/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:08.373 [404/740] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:08.373 [405/740] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:08.373 [406/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:08.373 [407/740] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:08.373 [408/740] Linking static target lib/librte_rib.a 00:02:08.373 [409/740] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:08.373 [410/740] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:08.373 [411/740] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:08.373 [412/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:08.373 [413/740] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:08.373 [414/740] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.373 [415/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:08.373 [416/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:08.373 [417/740] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:08.373 [418/740] Compiling C object 
lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:08.373 [419/740] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:08.373 [420/740] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:08.373 [421/740] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:08.373 [422/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:08.373 [423/740] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:08.373 [424/740] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:08.373 [425/740] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.373 [426/740] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.373 [427/740] Linking static target lib/librte_graph.a 00:02:08.373 [428/740] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:08.373 [429/740] Linking static target lib/librte_lpm.a 00:02:08.373 [430/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:08.373 [431/740] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.373 [432/740] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:08.373 [433/740] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:08.373 [434/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:08.373 [435/740] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:08.634 [436/740] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:08.634 [437/740] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:08.634 [438/740] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:08.634 [439/740] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:08.634 [440/740] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:08.634 [441/740] Linking static target lib/librte_efd.a 00:02:08.634 [442/740] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:08.634 [443/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:08.634 [444/740] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.634 [445/740] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:08.634 [446/740] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:08.634 [447/740] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:08.634 [448/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:08.634 [449/740] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:08.634 [450/740] Linking static target drivers/librte_bus_vdev.a 00:02:08.634 [451/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:08.634 [452/740] Compiling C object drivers/librte_bus_vdev.so.23.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:08.634 [453/740] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.634 [454/740] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.634 [455/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:08.634 [456/740] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:08.634 [457/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:08.634 [458/740] 
Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:08.634 [459/740] Linking static target lib/librte_fib.a 00:02:08.899 [460/740] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:08.899 [461/740] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.899 [462/740] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.899 [463/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:08.899 [464/740] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.899 [465/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:08.899 [466/740] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:08.899 [467/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:08.899 [468/740] Linking static target lib/librte_pdump.a 00:02:08.899 [469/740] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:08.899 [470/740] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.899 [471/740] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.899 [472/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:08.899 [473/740] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.899 [474/740] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.161 [475/740] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:09.161 [476/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:09.161 [477/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:09.161 [478/740] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.161 [479/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:09.161 [480/740] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:09.161 [481/740] Compiling C object drivers/librte_bus_pci.so.23.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:09.161 [482/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:09.161 [483/740] Linking static target drivers/librte_bus_pci.a 00:02:09.161 [484/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:09.161 [485/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:09.161 [486/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:09.161 [487/740] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.161 [488/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:09.161 [489/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:09.161 [490/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:09.161 [491/740] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:09.161 [492/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:09.161 [493/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:02:09.161 [494/740] Linking static target lib/librte_table.a 00:02:09.161 
[495/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:09.161 [496/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:09.420 [497/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:09.420 [498/740] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:09.420 [499/740] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.420 [500/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:09.420 [501/740] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:09.420 [502/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:09.420 [503/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:09.420 [504/740] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:09.420 [505/740] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:09.420 [506/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:09.420 [507/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:09.420 [508/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:09.420 [509/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:09.420 [510/740] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.420 [511/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:09.420 [512/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:09.420 [513/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:09.420 [514/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:09.420 [515/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:09.420 [516/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:09.420 [517/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:09.420 [518/740] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:09.420 [519/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:09.420 [520/740] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.420 [521/740] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:09.420 [522/740] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:09.420 [523/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:09.420 [524/740] Linking static target lib/librte_cryptodev.a 00:02:09.420 [525/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:09.420 [526/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:09.679 [527/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:09.679 [528/740] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:09.679 [529/740] Linking static target lib/librte_sched.a 00:02:09.679 [530/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:09.680 [531/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:09.680 [532/740] 
Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.680 [533/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:09.680 [534/740] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:09.680 [535/740] Linking static target lib/librte_node.a 00:02:09.680 [536/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:09.680 [537/740] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:09.680 [538/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:09.680 [539/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:09.680 [540/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:09.680 [541/740] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:09.680 [542/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:09.680 [543/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:09.680 [544/740] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:09.680 [545/740] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:09.680 [546/740] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.680 [547/740] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:09.680 [548/740] Compiling C object drivers/librte_mempool_ring.so.23.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:09.680 [549/740] Linking static target drivers/librte_mempool_ring.a 00:02:09.680 [550/740] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:09.680 [551/740] Linking static target lib/librte_ipsec.a 00:02:09.680 [552/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:09.938 [553/740] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:09.938 [554/740] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:09.938 [555/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:09.938 [556/740] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:09.938 [557/740] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:09.938 [558/740] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:09.938 [559/740] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:09.938 [560/740] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:09.938 [561/740] Linking static target lib/librte_ethdev.a 00:02:09.938 [562/740] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:09.938 [563/740] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.938 [564/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:09.938 [565/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:09.938 [566/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:09.938 [567/740] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:09.938 [568/740] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:09.938 [569/740] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:09.938 [570/740] Compiling C object 
app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:09.938 [571/740] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:09.938 [572/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:09.938 [573/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:09.938 [574/740] Linking static target lib/librte_member.a 00:02:09.938 [575/740] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:09.938 [576/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:09.938 [577/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:09.938 [578/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:09.938 [579/740] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:10.196 [580/740] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:10.196 [581/740] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:10.196 [582/740] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:10.196 [583/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:10.196 [584/740] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:10.196 [585/740] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.196 [586/740] Linking static target lib/librte_port.a 00:02:10.196 [587/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:10.196 [588/740] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.196 [589/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:10.196 [590/740] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:10.196 [591/740] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:10.196 [592/740] Linking static target lib/librte_eventdev.a 00:02:10.196 [593/740] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.196 [594/740] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:10.196 [595/740] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:10.196 [596/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:10.196 [597/740] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:10.454 [598/740] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:10.454 [599/740] Linking static target lib/librte_hash.a 00:02:10.454 [600/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:10.454 [601/740] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx2.c.o 00:02:10.454 [602/740] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:10.454 [603/740] Linking static target lib/librte_acl.a 00:02:10.454 [604/740] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:10.454 [605/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:10.454 [606/740] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.454 [607/740] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:10.454 [608/740] Linking static target drivers/net/i40e/base/libi40e_base.a 00:02:10.454 [609/740] Compiling C 
object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_avx2.c.o
00:02:10.713 [610/740] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o
00:02:10.713 [611/740] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o
00:02:10.713 [612/740] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o
00:02:10.972 [613/740] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o
00:02:10.972 [614/740] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output)
00:02:10.972 [615/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o
00:02:10.972 [616/740] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output)
00:02:11.230 [617/740] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o
00:02:11.489 [618/740] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output)
00:02:11.748 [619/740] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o
00:02:11.748 [620/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o
00:02:12.007 [621/740] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o
00:02:12.266 [622/740] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o
00:02:12.266 [623/740] Linking static target drivers/libtmp_rte_net_i40e.a
00:02:12.524 [624/740] Generating drivers/rte_net_i40e.pmd.c with a custom command
00:02:12.783 [625/740] Compiling C object drivers/librte_net_i40e.so.23.0.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:02:12.783 [626/740] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:02:12.783 [627/740] Linking static target drivers/librte_net_i40e.a
00:02:13.041 [628/740] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:13.041 [629/740] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o
00:02:13.608 [630/740] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:13.608 [631/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o
00:02:13.608 [632/740] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o
00:02:13.867 [633/740] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output)
00:02:19.140 [634/740] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:19.140 [635/740] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o
00:02:19.140 [636/740] Linking static target lib/librte_vhost.a
00:02:20.079 [637/740] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o
00:02:20.079 [638/740] Linking static target lib/librte_pipeline.a
00:02:20.649 [639/740] Linking target app/dpdk-dumpcap
00:02:20.649 [640/740] Linking target app/dpdk-pdump
00:02:20.649 [641/740] Linking target app/dpdk-test-eventdev
00:02:20.649 [642/740] Linking target app/dpdk-test-gpudev
00:02:20.649 [643/740] Linking target app/dpdk-test-sad
00:02:20.649 [644/740] Linking target app/dpdk-test-pipeline
00:02:20.649 [645/740] Linking target app/dpdk-test-compress-perf
00:02:20.649 [646/740] Linking target app/dpdk-test-security-perf
00:02:20.649 [647/740] Linking target app/dpdk-test-acl
00:02:20.649 [648/740] Linking target app/dpdk-test-flow-perf
00:02:20.649 [649/740] Linking target app/dpdk-test-cmdline
00:02:20.649 [650/740] Linking target app/dpdk-test-crypto-perf
00:02:20.649 [651/740] Linking target app/dpdk-test-bbdev
00:02:20.649 [652/740] Linking target app/dpdk-proc-info
00:02:20.649 [653/740] Linking target app/dpdk-test-regex
00:02:20.649 [654/740] Linking target app/dpdk-test-fib
00:02:20.649 [655/740] Linking target app/dpdk-testpmd
00:02:21.219 [656/740] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output)
00:02:21.219 [657/740] Linking target lib/librte_eal.so.23.0
00:02:21.479 [658/740] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output)
00:02:21.479 [659/740] Generating symbol file lib/librte_eal.so.23.0.p/librte_eal.so.23.0.symbols
00:02:21.479 [660/740] Linking target lib/librte_ring.so.23.0
00:02:21.479 [661/740] Linking target lib/librte_meter.so.23.0
00:02:21.479 [662/740] Linking target lib/librte_timer.so.23.0
00:02:21.479 [663/740] Linking target lib/librte_pci.so.23.0
00:02:21.479 [664/740] Linking target lib/librte_stack.so.23.0
00:02:21.479 [665/740] Linking target lib/librte_cfgfile.so.23.0
00:02:21.479 [666/740] Linking target lib/librte_jobstats.so.23.0
00:02:21.479 [667/740] Linking target lib/librte_dmadev.so.23.0
00:02:21.479 [668/740] Linking target lib/librte_rawdev.so.23.0
00:02:21.479 [669/740] Linking target drivers/librte_bus_vdev.so.23.0
00:02:21.479 [670/740] Linking target lib/librte_graph.so.23.0
00:02:21.479 [671/740] Linking target lib/librte_acl.so.23.0
00:02:21.479 [672/740] Generating symbol file lib/librte_meter.so.23.0.p/librte_meter.so.23.0.symbols
00:02:21.479 [673/740] Generating symbol file lib/librte_pci.so.23.0.p/librte_pci.so.23.0.symbols
00:02:21.479 [674/740] Generating symbol file lib/librte_timer.so.23.0.p/librte_timer.so.23.0.symbols
00:02:21.479 [675/740] Generating symbol file lib/librte_ring.so.23.0.p/librte_ring.so.23.0.symbols
00:02:21.479 [676/740] Generating symbol file drivers/librte_bus_vdev.so.23.0.p/librte_bus_vdev.so.23.0.symbols
00:02:21.479 [677/740] Generating symbol file lib/librte_graph.so.23.0.p/librte_graph.so.23.0.symbols
00:02:21.738 [678/740] Generating symbol file lib/librte_dmadev.so.23.0.p/librte_dmadev.so.23.0.symbols
00:02:21.738 [679/740] Generating symbol file lib/librte_acl.so.23.0.p/librte_acl.so.23.0.symbols
00:02:21.738 [680/740] Linking target lib/librte_rcu.so.23.0
00:02:21.738 [681/740] Linking target drivers/librte_bus_pci.so.23.0
00:02:21.738 [682/740] Linking target lib/librte_mempool.so.23.0
00:02:21.738 [683/740] Generating symbol file lib/librte_rcu.so.23.0.p/librte_rcu.so.23.0.symbols
00:02:21.738 [684/740] Generating symbol file lib/librte_mempool.so.23.0.p/librte_mempool.so.23.0.symbols
00:02:21.738 [685/740] Generating symbol file drivers/librte_bus_pci.so.23.0.p/librte_bus_pci.so.23.0.symbols
00:02:21.738 [686/740] Linking target lib/librte_mbuf.so.23.0
00:02:21.738 [687/740] Linking target drivers/librte_mempool_ring.so.23.0
00:02:21.738 [688/740] Linking target lib/librte_rib.so.23.0
00:02:21.997 [689/740] Generating symbol file lib/librte_mbuf.so.23.0.p/librte_mbuf.so.23.0.symbols
00:02:21.997 [690/740] Generating symbol file lib/librte_rib.so.23.0.p/librte_rib.so.23.0.symbols
00:02:21.997 [691/740] Linking target lib/librte_distributor.so.23.0
00:02:21.997 [692/740] Linking target lib/librte_regexdev.so.23.0
00:02:21.997 [693/740] Linking target lib/librte_compressdev.so.23.0
00:02:21.997 [694/740] Linking target lib/librte_net.so.23.0
00:02:21.997 [695/740] Linking target lib/librte_bbdev.so.23.0
00:02:21.997 [696/740] Linking target lib/librte_gpudev.so.23.0
00:02:21.997 [697/740] Linking target lib/librte_reorder.so.23.0
00:02:21.997 [698/740] Linking target lib/librte_cryptodev.so.23.0
00:02:21.997 [699/740] Linking target lib/librte_sched.so.23.0
00:02:21.997 [700/740] Linking target lib/librte_fib.so.23.0
00:02:22.256 [701/740] Generating symbol file lib/librte_net.so.23.0.p/librte_net.so.23.0.symbols
00:02:22.256 [702/740] Generating symbol file lib/librte_cryptodev.so.23.0.p/librte_cryptodev.so.23.0.symbols
00:02:22.256 [703/740] Generating symbol file lib/librte_sched.so.23.0.p/librte_sched.so.23.0.symbols
00:02:22.256 [704/740] Linking target lib/librte_hash.so.23.0
00:02:22.256 [705/740] Linking target lib/librte_cmdline.so.23.0
00:02:22.256 [706/740] Linking target lib/librte_ethdev.so.23.0
00:02:22.257 [707/740] Linking target lib/librte_security.so.23.0
00:02:22.257 [708/740] Generating symbol file lib/librte_hash.so.23.0.p/librte_hash.so.23.0.symbols
00:02:22.257 [709/740] Generating symbol file lib/librte_ethdev.so.23.0.p/librte_ethdev.so.23.0.symbols
00:02:22.257 [710/740] Generating symbol file lib/librte_security.so.23.0.p/librte_security.so.23.0.symbols
00:02:22.257 [711/740] Linking target lib/librte_efd.so.23.0
00:02:22.257 [712/740] Linking target lib/librte_lpm.so.23.0
00:02:22.257 [713/740] Linking target lib/librte_member.so.23.0
00:02:22.515 [714/740] Linking target lib/librte_metrics.so.23.0
00:02:22.515 [715/740] Linking target lib/librte_pcapng.so.23.0
00:02:22.515 [716/740] Linking target lib/librte_power.so.23.0
00:02:22.515 [717/740] Linking target lib/librte_gso.so.23.0
00:02:22.515 [718/740] Linking target lib/librte_gro.so.23.0
00:02:22.515 [719/740] Linking target lib/librte_ip_frag.so.23.0
00:02:22.515 [720/740] Linking target lib/librte_bpf.so.23.0
00:02:22.515 [721/740] Linking target lib/librte_vhost.so.23.0
00:02:22.515 [722/740] Linking target lib/librte_eventdev.so.23.0
00:02:22.515 [723/740] Linking target lib/librte_ipsec.so.23.0
00:02:22.515 [724/740] Linking target drivers/librte_net_i40e.so.23.0
00:02:22.515 [725/740] Generating symbol file lib/librte_lpm.so.23.0.p/librte_lpm.so.23.0.symbols
00:02:22.515 [726/740] Generating symbol file lib/librte_metrics.so.23.0.p/librte_metrics.so.23.0.symbols
00:02:22.515 [727/740] Generating symbol file lib/librte_pcapng.so.23.0.p/librte_pcapng.so.23.0.symbols
00:02:22.515 [728/740] Generating symbol file lib/librte_ip_frag.so.23.0.p/librte_ip_frag.so.23.0.symbols
00:02:22.515 [729/740] Generating symbol file lib/librte_bpf.so.23.0.p/librte_bpf.so.23.0.symbols
00:02:22.515 [730/740] Generating symbol file lib/librte_eventdev.so.23.0.p/librte_eventdev.so.23.0.symbols
00:02:22.515 [731/740] Linking target lib/librte_node.so.23.0
00:02:22.515 [732/740] Linking target lib/librte_bitratestats.so.23.0
00:02:22.515 [733/740] Linking target lib/librte_latencystats.so.23.0
00:02:22.515 [734/740] Linking target lib/librte_pdump.so.23.0
00:02:22.515 [735/740] Linking target lib/librte_port.so.23.0
00:02:22.775 [736/740] Generating symbol file lib/librte_port.so.23.0.p/librte_port.so.23.0.symbols
00:02:22.775 [737/740] Linking target lib/librte_table.so.23.0
00:02:23.035 [738/740] Generating symbol file lib/librte_table.so.23.0.p/librte_table.so.23.0.symbols
00:02:25.575 [739/740] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output)
00:02:25.575 [740/740] Linking target lib/librte_pipeline.so.23.0
00:02:25.575 16:26:22 build_native_dpdk -- common/autobuild_common.sh@194 -- $ uname -s
00:02:25.575 16:26:22 build_native_dpdk -- common/autobuild_common.sh@194 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:02:25.575 16:26:22 build_native_dpdk -- common/autobuild_common.sh@207 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j112 install
00:02:25.575 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp'
00:02:25.575 [0/1] Installing files.
00:02:25.575 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.575 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/flow_classify.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/ipv4_rules_file.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_classify/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_classify
00:02:25.576 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_blocks.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app
00:02:25.577 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/node
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/server
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.578 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.579 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:02:25.580 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/kni.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/kni.cli to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:25.581 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:25.582 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:25.582 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:25.582 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:25.843 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:25.843 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing 
lib/librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.843 Installing lib/librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_gso.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_pcapng.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 
00:02:25.844 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing lib/librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing drivers/librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:25.844 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing drivers/librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:25.844 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing drivers/librte_mempool_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:25.844 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:25.844 Installing drivers/librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0 00:02:25.844 Installing app/dpdk-dumpcap to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 
Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:25.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.844 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 
00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.845 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 
00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:25.846 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.108 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_empty_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_intel_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.109 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.110 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:26.111 
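The staging of DPDK's public headers into dpdk/build/include ends here, and the four standard usertools scripts are dropped into dpdk/build/bin alongside them. As a quick sanity check of that staged toolset, something like the following would work (a sketch; --show and --status are the usual flags for these scripts, but confirm against the DPDK 22.11 copies installed here, and note dpdk-telemetry.py additionally needs a running DPDK process to talk to):

$ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin
$ ./dpdk-hugepages.py --show    # report current hugepage mounts and reservations
$ ./dpdk-devbind.py --status    # list devices and which kernel/userspace driver holds them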
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:26.111 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:26.111 Installing symlink pointing to librte_kvargs.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.23 00:02:26.111 Installing symlink pointing to librte_kvargs.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:26.111 Installing symlink pointing to librte_telemetry.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.23 00:02:26.111 Installing symlink pointing to librte_telemetry.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:26.111 Installing symlink pointing to librte_eal.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.23 00:02:26.111 Installing symlink pointing to librte_eal.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:26.111 Installing symlink pointing to librte_ring.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.23 00:02:26.111 Installing symlink pointing to librte_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:26.111 Installing symlink pointing to librte_rcu.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.23 00:02:26.111 Installing symlink pointing to librte_rcu.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:26.111 Installing symlink pointing to librte_mempool.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.23 00:02:26.111 Installing symlink pointing to librte_mempool.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:26.111 Installing symlink pointing to librte_mbuf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.23 00:02:26.111 Installing symlink pointing to librte_mbuf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:26.111 Installing symlink pointing to librte_net.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.23 00:02:26.111 Installing symlink pointing to librte_net.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:26.111 Installing symlink pointing to librte_meter.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.23 00:02:26.111 Installing symlink pointing to librte_meter.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:26.111 Installing symlink pointing to librte_ethdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.23 00:02:26.111 Installing symlink pointing to librte_ethdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:26.111 Installing symlink pointing to librte_pci.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.23 00:02:26.111 Installing symlink pointing to librte_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:26.111 Installing symlink pointing to librte_cmdline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.23 00:02:26.111 Installing symlink pointing to librte_cmdline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:26.111 Installing symlink pointing to librte_metrics.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.23 00:02:26.111 Installing symlink pointing to librte_metrics.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:26.111 Installing symlink pointing to librte_hash.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.23 00:02:26.111 Installing symlink pointing to librte_hash.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:26.111 Installing symlink pointing to librte_timer.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.23 00:02:26.111 Installing symlink pointing to librte_timer.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:26.111 Installing symlink pointing to librte_acl.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.23 00:02:26.111 Installing symlink pointing to librte_acl.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:26.111 Installing symlink pointing to librte_bbdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.23 00:02:26.111 Installing symlink pointing to librte_bbdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:26.111 Installing symlink pointing to librte_bitratestats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.23 00:02:26.111 Installing symlink pointing to librte_bitratestats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:26.111 Installing symlink pointing to librte_bpf.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.23 00:02:26.111 Installing symlink pointing to librte_bpf.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:26.111 Installing symlink pointing to librte_cfgfile.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.23 00:02:26.111 Installing symlink pointing to librte_cfgfile.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:26.111 Installing symlink pointing to librte_compressdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.23 00:02:26.111 Installing symlink pointing to librte_compressdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:26.111 Installing symlink pointing to librte_cryptodev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.23 00:02:26.111 Installing symlink pointing to librte_cryptodev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:26.111 Installing symlink pointing 
to librte_distributor.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.23 00:02:26.111 Installing symlink pointing to librte_distributor.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:26.111 Installing symlink pointing to librte_efd.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.23 00:02:26.111 Installing symlink pointing to librte_efd.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:26.111 Installing symlink pointing to librte_eventdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.23 00:02:26.111 Installing symlink pointing to librte_eventdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:26.111 Installing symlink pointing to librte_gpudev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.23 00:02:26.111 Installing symlink pointing to librte_gpudev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:26.111 Installing symlink pointing to librte_gro.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.23 00:02:26.111 Installing symlink pointing to librte_gro.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:26.111 Installing symlink pointing to librte_gso.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.23 00:02:26.111 Installing symlink pointing to librte_gso.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:26.112 Installing symlink pointing to librte_ip_frag.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.23 00:02:26.112 Installing symlink pointing to librte_ip_frag.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:26.112 Installing symlink pointing to librte_jobstats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.23 00:02:26.112 Installing symlink pointing to librte_jobstats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:26.112 Installing symlink pointing to librte_latencystats.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.23 00:02:26.112 Installing symlink pointing to librte_latencystats.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:26.112 Installing symlink pointing to librte_lpm.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.23 00:02:26.112 Installing symlink pointing to librte_lpm.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:26.112 Installing symlink pointing to librte_member.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.23 00:02:26.112 Installing symlink pointing to librte_member.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:26.112 Installing symlink pointing to librte_pcapng.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.23 00:02:26.112 Installing symlink pointing to librte_pcapng.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:26.112 
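Each library gets the conventional ELF shared-library triplet here: the real object librte_<name>.so.23.0, a soname link ending in .so.23 that the run-time loader resolves, and a bare .so link for the compile-time linker. Done by hand, the equivalent would be roughly the following, using librte_power (the next library installed below) purely as the example:

$ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
$ ln -s librte_power.so.23.0 librte_power.so.23   # soname recorded in consumers; ld.so resolves this at run time
$ ln -s librte_power.so.23 librte_power.so        # dev link; -lrte_power resolves to this at link time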
Installing symlink pointing to librte_power.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.23 00:02:26.112 Installing symlink pointing to librte_power.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:26.112 Installing symlink pointing to librte_rawdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.23 00:02:26.112 Installing symlink pointing to librte_rawdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:26.112 Installing symlink pointing to librte_regexdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.23 00:02:26.112 Installing symlink pointing to librte_regexdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:26.112 Installing symlink pointing to librte_dmadev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.23 00:02:26.112 Installing symlink pointing to librte_dmadev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:26.112 Installing symlink pointing to librte_rib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.23 00:02:26.112 Installing symlink pointing to librte_rib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:26.112 Installing symlink pointing to librte_reorder.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.23 00:02:26.112 Installing symlink pointing to librte_reorder.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:26.112 Installing symlink pointing to librte_sched.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.23 00:02:26.112 Installing symlink pointing to librte_sched.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:26.112 Installing symlink pointing to librte_security.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.23 00:02:26.112 Installing symlink pointing to librte_security.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:26.112 Installing symlink pointing to librte_stack.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.23 00:02:26.112 Installing symlink pointing to librte_stack.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:26.112 Installing symlink pointing to librte_vhost.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.23 00:02:26.112 Installing symlink pointing to librte_vhost.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:26.112 './librte_bus_pci.so' -> 'dpdk/pmds-23.0/librte_bus_pci.so' 00:02:26.112 './librte_bus_pci.so.23' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23' 00:02:26.112 './librte_bus_pci.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_pci.so.23.0' 00:02:26.112 './librte_bus_vdev.so' -> 'dpdk/pmds-23.0/librte_bus_vdev.so' 00:02:26.112 './librte_bus_vdev.so.23' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23' 00:02:26.112 './librte_bus_vdev.so.23.0' -> 'dpdk/pmds-23.0/librte_bus_vdev.so.23.0' 00:02:26.112 './librte_mempool_ring.so' -> 'dpdk/pmds-23.0/librte_mempool_ring.so' 00:02:26.112 './librte_mempool_ring.so.23' -> 
'dpdk/pmds-23.0/librte_mempool_ring.so.23' 00:02:26.112 './librte_mempool_ring.so.23.0' -> 'dpdk/pmds-23.0/librte_mempool_ring.so.23.0' 00:02:26.112 './librte_net_i40e.so' -> 'dpdk/pmds-23.0/librte_net_i40e.so' 00:02:26.112 './librte_net_i40e.so.23' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23' 00:02:26.112 './librte_net_i40e.so.23.0' -> 'dpdk/pmds-23.0/librte_net_i40e.so.23.0' 00:02:26.112 Installing symlink pointing to librte_ipsec.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.23 00:02:26.112 Installing symlink pointing to librte_ipsec.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:26.112 Installing symlink pointing to librte_fib.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.23 00:02:26.112 Installing symlink pointing to librte_fib.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:26.112 Installing symlink pointing to librte_port.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.23 00:02:26.112 Installing symlink pointing to librte_port.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:26.112 Installing symlink pointing to librte_pdump.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.23 00:02:26.112 Installing symlink pointing to librte_pdump.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:26.112 Installing symlink pointing to librte_table.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.23 00:02:26.112 Installing symlink pointing to librte_table.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:26.112 Installing symlink pointing to librte_pipeline.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.23 00:02:26.112 Installing symlink pointing to librte_pipeline.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:26.112 Installing symlink pointing to librte_graph.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.23 00:02:26.112 Installing symlink pointing to librte_graph.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:26.112 Installing symlink pointing to librte_node.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.23 00:02:26.112 Installing symlink pointing to librte_node.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:26.112 Installing symlink pointing to librte_bus_pci.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so.23 00:02:26.112 Installing symlink pointing to librte_bus_pci.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_pci.so 00:02:26.112 Installing symlink pointing to librte_bus_vdev.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so.23 00:02:26.112 Installing symlink pointing to librte_bus_vdev.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_bus_vdev.so 00:02:26.112 Installing symlink pointing to librte_mempool_ring.so.23.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so.23 00:02:26.112 Installing symlink pointing to librte_mempool_ring.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_mempool_ring.so 00:02:26.112 Installing symlink pointing to librte_net_i40e.so.23.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so.23 00:02:26.112 Installing symlink pointing to librte_net_i40e.so.23 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-23.0/librte_net_i40e.so 00:02:26.112 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-23.0' 00:02:26.112 16:26:23 build_native_dpdk -- common/autobuild_common.sh@213 -- $ cat 00:02:26.112 16:26:23 build_native_dpdk -- common/autobuild_common.sh@218 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:26.112 00:02:26.112 real 0m26.134s 00:02:26.112 user 6m36.881s 00:02:26.112 sys 2m11.574s 00:02:26.112 16:26:23 build_native_dpdk -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:26.112 16:26:23 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x 00:02:26.112 ************************************ 00:02:26.112 END TEST build_native_dpdk 00:02:26.112 ************************************ 00:02:26.112 16:26:23 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:02:26.112 16:26:23 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:02:26.112 16:26:23 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:02:26.112 16:26:23 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:02:26.113 16:26:23 -- common/autobuild_common.sh@438 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:02:26.113 16:26:23 -- common/autotest_common.sh@1101 -- $ '[' 2 -le 1 ']' 00:02:26.113 16:26:23 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:02:26.113 16:26:23 -- common/autotest_common.sh@10 -- $ set +x 00:02:26.113 ************************************ 00:02:26.113 START TEST autobuild_llvm_precompile 00:02:26.113 ************************************ 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autotest_common.sh@1125 -- $ _llvm_precompile 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:02:26.113 Target: x86_64-redhat-linux-gnu 00:02:26.113 Thread model: posix 00:02:26.113 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 
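The precompile step above derives clang_num=17 by matching `clang --version` against a version regex, exports CC/CXX accordingly, and then globs for the matching libFuzzer runtime. Reproduced interactively, the lookup is approximately this (the glob is a simplified form of the script's fuzzer_libs pattern; the resulting path is distro-specific, and the outputs shown are the ones this log reports):

$ clang --version | head -1
clang version 17.0.6 (Fedora 17.0.6-2.fc39)
$ ls /usr/lib*/clang/17/lib/*linux*/libclang_rt.fuzzer_no_main*.a
/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a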
00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:02:26.113 16:26:23 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:26.372 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 00:02:26.632 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:26.632 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:26.632 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:27.199 Using 'verbs' RDMA provider 00:02:43.017 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:02:55.223 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:02:55.791 Creating mk/config.mk...done. 00:02:55.791 Creating mk/cc.flags.mk...done. 00:02:55.791 Type 'make' to build. 00:02:55.791 00:02:55.791 real 0m29.540s 00:02:55.791 user 0m12.925s 00:02:55.791 sys 0m15.946s 00:02:55.791 16:26:53 autobuild_llvm_precompile -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:02:55.791 16:26:53 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x 00:02:55.791 ************************************ 00:02:55.791 END TEST autobuild_llvm_precompile 00:02:55.791 ************************************ 00:02:55.791 16:26:53 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:02:55.791 16:26:53 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:02:55.791 16:26:53 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:02:55.791 16:26:53 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:02:55.791 16:26:53 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:02:56.051 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs... 
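The "Using .../pkgconfig for additional libs" line means configure found the libdpdk.pc and libdpdk-libs.pc files installed earlier and will build SPDK against this private DPDK tree rather than a system copy. Any other build could consume it the same way; a minimal sketch, assuming the same workspace paths:

$ export PKG_CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig
$ pkg-config --modversion libdpdk       # version of the just-built DPDK tree
$ pkg-config --cflags --libs libdpdk    # the -I/-L/-l flags configure folds into SPDK's link lines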
00:02:56.051 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:56.051 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:56.310 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:02:56.569 Using 'verbs' RDMA provider 00:03:09.715 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:03:21.945 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:03:21.945 Creating mk/config.mk...done. 00:03:21.945 Creating mk/cc.flags.mk...done. 00:03:21.945 Type 'make' to build. 00:03:21.945 16:27:18 -- spdk/autobuild.sh@70 -- $ run_test make make -j112 00:03:21.945 16:27:18 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']' 00:03:21.945 16:27:18 -- common/autotest_common.sh@1107 -- $ xtrace_disable 00:03:21.945 16:27:18 -- common/autotest_common.sh@10 -- $ set +x 00:03:21.945 ************************************ 00:03:21.945 START TEST make 00:03:21.945 ************************************ 00:03:21.945 16:27:18 make -- common/autotest_common.sh@1125 -- $ make -j112 00:03:21.945 make[1]: Nothing to be done for 'all'. 00:03:22.888 The Meson build system 00:03:22.888 Version: 1.5.0 00:03:22.888 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:03:22.888 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:22.888 Build type: native build 00:03:22.888 Project name: libvfio-user 00:03:22.888 Project version: 0.0.1 00:03:22.888 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:03:22.888 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:03:22.888 Host machine cpu family: x86_64 00:03:22.888 Host machine cpu: x86_64 00:03:22.888 Run-time dependency threads found: YES 00:03:22.888 Library dl found: YES 00:03:22.888 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:22.888 Run-time dependency json-c found: YES 0.17 00:03:22.888 Run-time dependency cmocka found: YES 1.1.7 00:03:22.888 Program pytest-3 found: NO 00:03:22.888 Program flake8 found: NO 00:03:22.888 Program misspell-fixer found: NO 00:03:22.888 Program restructuredtext-lint found: NO 00:03:22.888 Program valgrind found: YES (/usr/bin/valgrind) 00:03:22.888 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:03:22.888 Compiler for C supports arguments -Wmissing-declarations: YES 00:03:22.888 Compiler for C supports arguments -Wwrite-strings: YES 00:03:22.888 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:03:22.888 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:03:22.888 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:03:22.888 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:03:22.888 Build targets in project: 8 00:03:22.888 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:03:22.888 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:03:22.888 00:03:22.888 libvfio-user 0.0.1 00:03:22.888 00:03:22.888 User defined options 00:03:22.888 buildtype : debug 00:03:22.888 default_library: static 00:03:22.888 libdir : /usr/local/lib 00:03:22.888 00:03:22.888 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:23.148 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:23.148 [1/36] Compiling C object samples/null.p/null.c.o 00:03:23.148 [2/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:03:23.148 [3/36] Compiling C object samples/lspci.p/lspci.c.o 00:03:23.148 [4/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:03:23.148 [5/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:03:23.148 [6/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:03:23.148 [7/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:03:23.148 [8/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:03:23.148 [9/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:03:23.148 [10/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:03:23.148 [11/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:03:23.148 [12/36] Compiling C object test/unit_tests.p/mocks.c.o 00:03:23.148 [13/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:03:23.148 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:03:23.148 [15/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:03:23.148 [16/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:03:23.148 [17/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:03:23.148 [18/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:03:23.148 [19/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:03:23.148 [20/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:03:23.148 [21/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:03:23.148 [22/36] Compiling C object samples/server.p/server.c.o 00:03:23.148 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:03:23.148 [24/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:03:23.149 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:03:23.149 [26/36] Compiling C object samples/client.p/client.c.o 00:03:23.408 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:03:23.408 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:03:23.408 [29/36] Linking static target lib/libvfio-user.a 00:03:23.408 [30/36] Linking target samples/client 00:03:23.408 [31/36] Linking target test/unit_tests 00:03:23.408 [32/36] Linking target samples/lspci 00:03:23.408 [33/36] Linking target samples/server 00:03:23.408 [34/36] Linking target samples/shadow_ioeventfd_server 00:03:23.408 [35/36] Linking target samples/null 00:03:23.408 [36/36] Linking target samples/gpio-pci-idio-16 00:03:23.408 INFO: autodetecting backend as ninja 00:03:23.408 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:23.408 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:23.668 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:23.668 ninja: no work to do. 00:03:35.960 CC lib/log/log.o 00:03:35.960 CC lib/log/log_flags.o 00:03:35.960 CC lib/log/log_deprecated.o 00:03:35.960 CC lib/ut_mock/mock.o 00:03:35.960 CC lib/ut/ut.o 00:03:35.960 LIB libspdk_log.a 00:03:35.960 LIB libspdk_ut.a 00:03:35.960 LIB libspdk_ut_mock.a 00:03:36.219 CC lib/util/base64.o 00:03:36.219 CC lib/util/bit_array.o 00:03:36.219 CC lib/util/cpuset.o 00:03:36.219 CC lib/util/crc16.o 00:03:36.219 CC lib/util/crc32.o 00:03:36.219 CC lib/util/crc32c.o 00:03:36.219 CC lib/util/crc32_ieee.o 00:03:36.219 CC lib/util/crc64.o 00:03:36.219 CC lib/util/dif.o 00:03:36.219 CC lib/util/fd.o 00:03:36.219 CC lib/util/fd_group.o 00:03:36.219 CC lib/util/file.o 00:03:36.219 CC lib/util/net.o 00:03:36.219 CC lib/util/hexlify.o 00:03:36.219 CC lib/util/iov.o 00:03:36.219 CC lib/util/math.o 00:03:36.219 CC lib/util/pipe.o 00:03:36.219 CC lib/util/strerror_tls.o 00:03:36.219 CC lib/util/string.o 00:03:36.219 CC lib/util/xor.o 00:03:36.219 CC lib/util/uuid.o 00:03:36.219 CC lib/util/zipf.o 00:03:36.219 CC lib/util/md5.o 00:03:36.219 CXX lib/trace_parser/trace.o 00:03:36.219 CC lib/ioat/ioat.o 00:03:36.219 CC lib/dma/dma.o 00:03:36.219 CC lib/vfio_user/host/vfio_user_pci.o 00:03:36.219 CC lib/vfio_user/host/vfio_user.o 00:03:36.219 LIB libspdk_dma.a 00:03:36.219 LIB libspdk_ioat.a 00:03:36.477 LIB libspdk_vfio_user.a 00:03:36.477 LIB libspdk_util.a 00:03:36.736 LIB libspdk_trace_parser.a 00:03:36.736 CC lib/conf/conf.o 00:03:36.736 CC lib/vmd/led.o 00:03:36.736 CC lib/vmd/vmd.o 00:03:36.736 CC lib/env_dpdk/env.o 00:03:36.736 CC lib/env_dpdk/init.o 00:03:36.736 CC lib/env_dpdk/memory.o 00:03:36.736 CC lib/env_dpdk/threads.o 00:03:36.736 CC lib/env_dpdk/pci.o 00:03:36.736 CC lib/env_dpdk/pci_virtio.o 00:03:36.736 CC lib/env_dpdk/pci_ioat.o 00:03:36.736 CC lib/env_dpdk/pci_vmd.o 00:03:36.736 CC lib/env_dpdk/pci_idxd.o 00:03:36.736 CC lib/env_dpdk/pci_event.o 00:03:36.736 CC lib/env_dpdk/sigbus_handler.o 00:03:36.736 CC lib/env_dpdk/pci_dpdk.o 00:03:36.736 CC lib/json/json_parse.o 00:03:36.736 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:36.736 CC lib/json/json_write.o 00:03:36.736 CC lib/json/json_util.o 00:03:36.736 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:36.736 CC lib/rdma_utils/rdma_utils.o 00:03:36.736 CC lib/idxd/idxd_user.o 00:03:36.736 CC lib/rdma_provider/common.o 00:03:36.736 CC lib/idxd/idxd.o 00:03:36.736 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:36.736 CC lib/idxd/idxd_kernel.o 00:03:36.736 LIB libspdk_conf.a 00:03:36.994 LIB libspdk_rdma_provider.a 00:03:36.994 LIB libspdk_json.a 00:03:36.994 LIB libspdk_rdma_utils.a 00:03:36.994 LIB libspdk_idxd.a 00:03:36.994 LIB libspdk_vmd.a 00:03:37.252 CC lib/jsonrpc/jsonrpc_server.o 00:03:37.252 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:37.252 CC lib/jsonrpc/jsonrpc_client.o 00:03:37.252 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:37.253 LIB libspdk_jsonrpc.a 00:03:37.820 LIB libspdk_env_dpdk.a 00:03:37.820 CC lib/rpc/rpc.o 00:03:37.820 LIB libspdk_rpc.a 00:03:38.078 CC lib/notify/notify.o 00:03:38.078 CC lib/notify/notify_rpc.o 00:03:38.078 CC lib/keyring/keyring.o 00:03:38.078 CC lib/keyring/keyring_rpc.o 00:03:38.078 CC lib/trace/trace.o 00:03:38.078 CC lib/trace/trace_flags.o 00:03:38.078 CC lib/trace/trace_rpc.o 00:03:38.337 LIB libspdk_notify.a 00:03:38.337 LIB libspdk_keyring.a 00:03:38.337 LIB 
libspdk_trace.a 00:03:38.596 CC lib/sock/sock.o 00:03:38.596 CC lib/sock/sock_rpc.o 00:03:38.596 CC lib/thread/thread.o 00:03:38.596 CC lib/thread/iobuf.o 00:03:38.855 LIB libspdk_sock.a 00:03:39.114 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:39.114 CC lib/nvme/nvme_ctrlr.o 00:03:39.114 CC lib/nvme/nvme_fabric.o 00:03:39.114 CC lib/nvme/nvme_ns_cmd.o 00:03:39.114 CC lib/nvme/nvme_ns.o 00:03:39.114 CC lib/nvme/nvme_pcie_common.o 00:03:39.114 CC lib/nvme/nvme_pcie.o 00:03:39.114 CC lib/nvme/nvme_qpair.o 00:03:39.114 CC lib/nvme/nvme.o 00:03:39.114 CC lib/nvme/nvme_quirks.o 00:03:39.114 CC lib/nvme/nvme_transport.o 00:03:39.114 CC lib/nvme/nvme_opal.o 00:03:39.114 CC lib/nvme/nvme_discovery.o 00:03:39.114 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:39.114 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:39.114 CC lib/nvme/nvme_tcp.o 00:03:39.114 CC lib/nvme/nvme_io_msg.o 00:03:39.114 CC lib/nvme/nvme_poll_group.o 00:03:39.114 CC lib/nvme/nvme_zns.o 00:03:39.114 CC lib/nvme/nvme_auth.o 00:03:39.114 CC lib/nvme/nvme_stubs.o 00:03:39.114 CC lib/nvme/nvme_cuse.o 00:03:39.114 CC lib/nvme/nvme_vfio_user.o 00:03:39.114 CC lib/nvme/nvme_rdma.o 00:03:39.373 LIB libspdk_thread.a 00:03:39.631 CC lib/virtio/virtio.o 00:03:39.631 CC lib/vfu_tgt/tgt_rpc.o 00:03:39.631 CC lib/vfu_tgt/tgt_endpoint.o 00:03:39.631 CC lib/virtio/virtio_pci.o 00:03:39.631 CC lib/virtio/virtio_vhost_user.o 00:03:39.631 CC lib/virtio/virtio_vfio_user.o 00:03:39.631 CC lib/accel/accel_rpc.o 00:03:39.631 CC lib/accel/accel.o 00:03:39.631 CC lib/fsdev/fsdev_io.o 00:03:39.631 CC lib/fsdev/fsdev.o 00:03:39.631 CC lib/accel/accel_sw.o 00:03:39.631 CC lib/fsdev/fsdev_rpc.o 00:03:39.631 CC lib/init/json_config.o 00:03:39.631 CC lib/blob/request.o 00:03:39.631 CC lib/blob/blobstore.o 00:03:39.631 CC lib/init/subsystem.o 00:03:39.631 CC lib/init/subsystem_rpc.o 00:03:39.631 CC lib/blob/zeroes.o 00:03:39.631 CC lib/init/rpc.o 00:03:39.631 CC lib/blob/blob_bs_dev.o 00:03:39.890 LIB libspdk_init.a 00:03:39.890 LIB libspdk_virtio.a 00:03:39.890 LIB libspdk_vfu_tgt.a 00:03:39.890 LIB libspdk_fsdev.a 00:03:40.149 CC lib/event/app.o 00:03:40.150 CC lib/event/reactor.o 00:03:40.150 CC lib/event/app_rpc.o 00:03:40.150 CC lib/event/log_rpc.o 00:03:40.150 CC lib/event/scheduler_static.o 00:03:40.409 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:40.409 LIB libspdk_event.a 00:03:40.409 LIB libspdk_accel.a 00:03:40.409 LIB libspdk_nvme.a 00:03:40.669 CC lib/bdev/bdev.o 00:03:40.669 CC lib/bdev/bdev_rpc.o 00:03:40.669 CC lib/bdev/bdev_zone.o 00:03:40.669 CC lib/bdev/part.o 00:03:40.669 CC lib/bdev/scsi_nvme.o 00:03:40.669 LIB libspdk_fuse_dispatcher.a 00:03:41.239 LIB libspdk_blob.a 00:03:41.498 CC lib/blobfs/blobfs.o 00:03:41.498 CC lib/blobfs/tree.o 00:03:41.498 CC lib/lvol/lvol.o 00:03:42.068 LIB libspdk_lvol.a 00:03:42.068 LIB libspdk_blobfs.a 00:03:42.327 LIB libspdk_bdev.a 00:03:42.585 CC lib/ublk/ublk.o 00:03:42.585 CC lib/ublk/ublk_rpc.o 00:03:42.585 CC lib/ftl/ftl_core.o 00:03:42.585 CC lib/ftl/ftl_init.o 00:03:42.585 CC lib/nbd/nbd.o 00:03:42.585 CC lib/ftl/ftl_layout.o 00:03:42.585 CC lib/nvmf/ctrlr.o 00:03:42.585 CC lib/nbd/nbd_rpc.o 00:03:42.585 CC lib/ftl/ftl_debug.o 00:03:42.585 CC lib/ftl/ftl_io.o 00:03:42.585 CC lib/ftl/ftl_sb.o 00:03:42.585 CC lib/nvmf/ctrlr_discovery.o 00:03:42.585 CC lib/ftl/ftl_l2p.o 00:03:42.585 CC lib/ftl/ftl_nv_cache.o 00:03:42.585 CC lib/ftl/ftl_l2p_flat.o 00:03:42.585 CC lib/nvmf/ctrlr_bdev.o 00:03:42.585 CC lib/nvmf/subsystem.o 00:03:42.585 CC lib/ftl/ftl_band.o 00:03:42.585 CC lib/nvmf/nvmf.o 00:03:42.585 CC lib/ftl/ftl_band_ops.o 
00:03:42.585 CC lib/ftl/ftl_reloc.o 00:03:42.585 CC lib/ftl/ftl_writer.o 00:03:42.585 CC lib/ftl/ftl_rq.o 00:03:42.585 CC lib/nvmf/nvmf_rpc.o 00:03:42.585 CC lib/nvmf/tcp.o 00:03:42.585 CC lib/nvmf/stubs.o 00:03:42.585 CC lib/nvmf/transport.o 00:03:42.585 CC lib/ftl/ftl_l2p_cache.o 00:03:42.585 CC lib/nvmf/mdns_server.o 00:03:42.585 CC lib/ftl/ftl_p2l.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:42.585 CC lib/nvmf/vfio_user.o 00:03:42.585 CC lib/scsi/dev.o 00:03:42.585 CC lib/ftl/ftl_p2l_log.o 00:03:42.585 CC lib/scsi/lun.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt.o 00:03:42.585 CC lib/nvmf/rdma.o 00:03:42.585 CC lib/scsi/port.o 00:03:42.585 CC lib/nvmf/auth.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:42.585 CC lib/scsi/scsi.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:42.585 CC lib/scsi/scsi_rpc.o 00:03:42.585 CC lib/scsi/scsi_bdev.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:42.585 CC lib/scsi/scsi_pr.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:42.585 CC lib/scsi/task.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:42.585 CC lib/ftl/utils/ftl_conf.o 00:03:42.585 CC lib/ftl/utils/ftl_md.o 00:03:42.585 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:42.585 CC lib/ftl/utils/ftl_mempool.o 00:03:42.585 CC lib/ftl/utils/ftl_property.o 00:03:42.585 CC lib/ftl/utils/ftl_bitmap.o 00:03:42.585 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:42.585 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:42.585 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:42.585 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:42.585 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:42.585 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:42.585 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:42.585 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:42.585 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:42.585 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:42.585 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:42.585 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:42.585 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:42.585 CC lib/ftl/base/ftl_base_dev.o 00:03:42.585 CC lib/ftl/base/ftl_base_bdev.o 00:03:42.585 CC lib/ftl/ftl_trace.o 00:03:42.844 LIB libspdk_nbd.a 00:03:43.102 LIB libspdk_ublk.a 00:03:43.102 LIB libspdk_scsi.a 00:03:43.361 LIB libspdk_ftl.a 00:03:43.361 CC lib/iscsi/conn.o 00:03:43.361 CC lib/iscsi/init_grp.o 00:03:43.361 CC lib/vhost/vhost.o 00:03:43.361 CC lib/vhost/vhost_rpc.o 00:03:43.361 CC lib/iscsi/iscsi.o 00:03:43.361 CC lib/iscsi/param.o 00:03:43.361 CC lib/vhost/vhost_scsi.o 00:03:43.361 CC lib/iscsi/portal_grp.o 00:03:43.361 CC lib/vhost/vhost_blk.o 00:03:43.361 CC lib/vhost/rte_vhost_user.o 00:03:43.361 CC lib/iscsi/tgt_node.o 00:03:43.361 CC lib/iscsi/iscsi_subsystem.o 00:03:43.361 CC lib/iscsi/iscsi_rpc.o 00:03:43.361 CC lib/iscsi/task.o 00:03:43.930 LIB libspdk_nvmf.a 00:03:43.930 LIB libspdk_vhost.a 00:03:44.189 LIB libspdk_iscsi.a 00:03:44.757 CC module/vfu_device/vfu_virtio_blk.o 00:03:44.757 CC module/vfu_device/vfu_virtio.o 00:03:44.757 CC module/vfu_device/vfu_virtio_fs.o 00:03:44.757 CC module/vfu_device/vfu_virtio_scsi.o 00:03:44.757 CC module/vfu_device/vfu_virtio_rpc.o 00:03:44.757 CC module/env_dpdk/env_dpdk_rpc.o 00:03:44.757 CC module/accel/error/accel_error.o 00:03:44.757 CC module/accel/error/accel_error_rpc.o 00:03:44.757 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:44.757 
CC module/sock/posix/posix.o 00:03:44.757 CC module/accel/dsa/accel_dsa_rpc.o 00:03:44.757 CC module/accel/dsa/accel_dsa.o 00:03:44.757 CC module/scheduler/gscheduler/gscheduler.o 00:03:44.757 CC module/accel/ioat/accel_ioat.o 00:03:44.757 CC module/accel/ioat/accel_ioat_rpc.o 00:03:44.757 CC module/accel/iaa/accel_iaa.o 00:03:44.757 CC module/accel/iaa/accel_iaa_rpc.o 00:03:44.757 LIB libspdk_env_dpdk_rpc.a 00:03:44.757 CC module/blob/bdev/blob_bdev.o 00:03:44.757 CC module/keyring/linux/keyring.o 00:03:44.757 CC module/keyring/linux/keyring_rpc.o 00:03:44.757 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:44.757 CC module/fsdev/aio/fsdev_aio.o 00:03:44.757 CC module/fsdev/aio/fsdev_aio_rpc.o 00:03:44.757 CC module/fsdev/aio/linux_aio_mgr.o 00:03:44.757 CC module/keyring/file/keyring_rpc.o 00:03:44.757 CC module/keyring/file/keyring.o 00:03:44.757 LIB libspdk_scheduler_gscheduler.a 00:03:44.757 LIB libspdk_accel_error.a 00:03:44.757 LIB libspdk_scheduler_dpdk_governor.a 00:03:44.757 LIB libspdk_keyring_linux.a 00:03:44.757 LIB libspdk_accel_ioat.a 00:03:44.757 LIB libspdk_keyring_file.a 00:03:45.015 LIB libspdk_scheduler_dynamic.a 00:03:45.015 LIB libspdk_accel_iaa.a 00:03:45.015 LIB libspdk_blob_bdev.a 00:03:45.015 LIB libspdk_accel_dsa.a 00:03:45.015 LIB libspdk_vfu_device.a 00:03:45.273 LIB libspdk_sock_posix.a 00:03:45.273 LIB libspdk_fsdev_aio.a 00:03:45.273 CC module/bdev/delay/vbdev_delay.o 00:03:45.273 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:45.273 CC module/bdev/passthru/vbdev_passthru.o 00:03:45.273 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:45.273 CC module/bdev/null/bdev_null.o 00:03:45.273 CC module/bdev/null/bdev_null_rpc.o 00:03:45.273 CC module/bdev/raid/bdev_raid.o 00:03:45.273 CC module/bdev/raid/bdev_raid_rpc.o 00:03:45.273 CC module/bdev/raid/bdev_raid_sb.o 00:03:45.273 CC module/bdev/raid/raid1.o 00:03:45.273 CC module/bdev/gpt/gpt.o 00:03:45.273 CC module/bdev/raid/raid0.o 00:03:45.273 CC module/bdev/gpt/vbdev_gpt.o 00:03:45.273 CC module/bdev/raid/concat.o 00:03:45.273 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:45.273 CC module/bdev/nvme/bdev_nvme.o 00:03:45.273 CC module/bdev/ftl/bdev_ftl.o 00:03:45.273 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:45.273 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:45.273 CC module/bdev/lvol/vbdev_lvol.o 00:03:45.273 CC module/bdev/split/vbdev_split.o 00:03:45.273 CC module/bdev/nvme/vbdev_opal.o 00:03:45.273 CC module/bdev/nvme/bdev_mdns_client.o 00:03:45.273 CC module/bdev/nvme/nvme_rpc.o 00:03:45.273 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:45.273 CC module/bdev/split/vbdev_split_rpc.o 00:03:45.273 CC module/bdev/error/vbdev_error.o 00:03:45.273 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:45.273 CC module/bdev/error/vbdev_error_rpc.o 00:03:45.273 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:45.273 CC module/blobfs/bdev/blobfs_bdev.o 00:03:45.273 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:45.273 CC module/bdev/iscsi/bdev_iscsi.o 00:03:45.273 CC module/bdev/malloc/bdev_malloc.o 00:03:45.273 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:45.273 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:45.273 CC module/bdev/aio/bdev_aio.o 00:03:45.273 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:45.273 CC module/bdev/aio/bdev_aio_rpc.o 00:03:45.273 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:45.273 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:45.273 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:45.531 LIB libspdk_blobfs_bdev.a 00:03:45.531 LIB libspdk_bdev_split.a 00:03:45.531 LIB libspdk_bdev_null.a 
00:03:45.531 LIB libspdk_bdev_gpt.a 00:03:45.531 LIB libspdk_bdev_error.a 00:03:45.531 LIB libspdk_bdev_passthru.a 00:03:45.531 LIB libspdk_bdev_ftl.a 00:03:45.531 LIB libspdk_bdev_delay.a 00:03:45.531 LIB libspdk_bdev_aio.a 00:03:45.531 LIB libspdk_bdev_iscsi.a 00:03:45.531 LIB libspdk_bdev_zone_block.a 00:03:45.531 LIB libspdk_bdev_malloc.a 00:03:45.789 LIB libspdk_bdev_lvol.a 00:03:45.789 LIB libspdk_bdev_virtio.a 00:03:46.048 LIB libspdk_bdev_raid.a 00:03:46.615 LIB libspdk_bdev_nvme.a 00:03:47.183 CC module/event/subsystems/vmd/vmd.o 00:03:47.183 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:47.183 CC module/event/subsystems/sock/sock.o 00:03:47.183 CC module/event/subsystems/iobuf/iobuf.o 00:03:47.183 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:47.183 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:47.183 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:47.183 CC module/event/subsystems/keyring/keyring.o 00:03:47.183 CC module/event/subsystems/fsdev/fsdev.o 00:03:47.183 CC module/event/subsystems/scheduler/scheduler.o 00:03:47.442 LIB libspdk_event_vmd.a 00:03:47.442 LIB libspdk_event_sock.a 00:03:47.442 LIB libspdk_event_vhost_blk.a 00:03:47.442 LIB libspdk_event_keyring.a 00:03:47.442 LIB libspdk_event_iobuf.a 00:03:47.442 LIB libspdk_event_fsdev.a 00:03:47.442 LIB libspdk_event_scheduler.a 00:03:47.442 LIB libspdk_event_vfu_tgt.a 00:03:47.699 CC module/event/subsystems/accel/accel.o 00:03:47.699 LIB libspdk_event_accel.a 00:03:47.958 CC module/event/subsystems/bdev/bdev.o 00:03:48.217 LIB libspdk_event_bdev.a 00:03:48.476 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:48.476 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:48.476 CC module/event/subsystems/ublk/ublk.o 00:03:48.476 CC module/event/subsystems/scsi/scsi.o 00:03:48.476 CC module/event/subsystems/nbd/nbd.o 00:03:48.476 LIB libspdk_event_ublk.a 00:03:48.736 LIB libspdk_event_nbd.a 00:03:48.736 LIB libspdk_event_scsi.a 00:03:48.736 LIB libspdk_event_nvmf.a 00:03:48.996 CC module/event/subsystems/iscsi/iscsi.o 00:03:48.996 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:48.996 LIB libspdk_event_iscsi.a 00:03:48.996 LIB libspdk_event_vhost_scsi.a 00:03:49.256 CC test/rpc_client/rpc_client_test.o 00:03:49.256 CC app/trace_record/trace_record.o 00:03:49.256 CXX app/trace/trace.o 00:03:49.256 TEST_HEADER include/spdk/accel.h 00:03:49.256 TEST_HEADER include/spdk/accel_module.h 00:03:49.256 TEST_HEADER include/spdk/barrier.h 00:03:49.521 TEST_HEADER include/spdk/assert.h 00:03:49.521 TEST_HEADER include/spdk/base64.h 00:03:49.521 TEST_HEADER include/spdk/bdev_zone.h 00:03:49.521 TEST_HEADER include/spdk/bdev_module.h 00:03:49.521 TEST_HEADER include/spdk/bdev.h 00:03:49.521 CC app/spdk_nvme_perf/perf.o 00:03:49.521 CC app/spdk_nvme_discover/discovery_aer.o 00:03:49.521 TEST_HEADER include/spdk/blob_bdev.h 00:03:49.521 TEST_HEADER include/spdk/bit_array.h 00:03:49.521 TEST_HEADER include/spdk/bit_pool.h 00:03:49.521 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:49.521 CC app/spdk_top/spdk_top.o 00:03:49.521 TEST_HEADER include/spdk/blobfs.h 00:03:49.521 TEST_HEADER include/spdk/conf.h 00:03:49.522 TEST_HEADER include/spdk/blob.h 00:03:49.522 TEST_HEADER include/spdk/config.h 00:03:49.522 TEST_HEADER include/spdk/crc16.h 00:03:49.522 TEST_HEADER include/spdk/cpuset.h 00:03:49.522 TEST_HEADER include/spdk/crc64.h 00:03:49.522 TEST_HEADER include/spdk/crc32.h 00:03:49.522 TEST_HEADER include/spdk/endian.h 00:03:49.522 TEST_HEADER include/spdk/dif.h 00:03:49.522 TEST_HEADER include/spdk/dma.h 00:03:49.522 
TEST_HEADER include/spdk/event.h 00:03:49.522 TEST_HEADER include/spdk/env_dpdk.h 00:03:49.522 TEST_HEADER include/spdk/env.h 00:03:49.522 TEST_HEADER include/spdk/fd_group.h 00:03:49.522 TEST_HEADER include/spdk/fd.h 00:03:49.522 TEST_HEADER include/spdk/fsdev.h 00:03:49.522 TEST_HEADER include/spdk/fsdev_module.h 00:03:49.522 TEST_HEADER include/spdk/file.h 00:03:49.522 TEST_HEADER include/spdk/fuse_dispatcher.h 00:03:49.522 TEST_HEADER include/spdk/ftl.h 00:03:49.522 TEST_HEADER include/spdk/gpt_spec.h 00:03:49.522 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:49.522 TEST_HEADER include/spdk/idxd.h 00:03:49.522 TEST_HEADER include/spdk/hexlify.h 00:03:49.522 TEST_HEADER include/spdk/init.h 00:03:49.522 TEST_HEADER include/spdk/histogram_data.h 00:03:49.522 TEST_HEADER include/spdk/idxd_spec.h 00:03:49.522 CC app/spdk_nvme_identify/identify.o 00:03:49.522 TEST_HEADER include/spdk/ioat_spec.h 00:03:49.522 TEST_HEADER include/spdk/iscsi_spec.h 00:03:49.522 TEST_HEADER include/spdk/ioat.h 00:03:49.522 CC app/spdk_lspci/spdk_lspci.o 00:03:49.522 TEST_HEADER include/spdk/json.h 00:03:49.522 TEST_HEADER include/spdk/jsonrpc.h 00:03:49.522 TEST_HEADER include/spdk/keyring.h 00:03:49.522 TEST_HEADER include/spdk/likely.h 00:03:49.522 TEST_HEADER include/spdk/log.h 00:03:49.522 TEST_HEADER include/spdk/keyring_module.h 00:03:49.522 TEST_HEADER include/spdk/lvol.h 00:03:49.522 CC app/spdk_dd/spdk_dd.o 00:03:49.522 TEST_HEADER include/spdk/memory.h 00:03:49.522 TEST_HEADER include/spdk/md5.h 00:03:49.522 TEST_HEADER include/spdk/mmio.h 00:03:49.522 TEST_HEADER include/spdk/net.h 00:03:49.522 TEST_HEADER include/spdk/notify.h 00:03:49.522 TEST_HEADER include/spdk/nbd.h 00:03:49.522 TEST_HEADER include/spdk/nvme.h 00:03:49.522 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:49.522 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:49.522 TEST_HEADER include/spdk/nvme_intel.h 00:03:49.522 TEST_HEADER include/spdk/nvme_spec.h 00:03:49.522 TEST_HEADER include/spdk/nvme_zns.h 00:03:49.522 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:49.522 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:49.522 TEST_HEADER include/spdk/nvmf.h 00:03:49.522 TEST_HEADER include/spdk/opal.h 00:03:49.522 TEST_HEADER include/spdk/nvmf_transport.h 00:03:49.522 TEST_HEADER include/spdk/opal_spec.h 00:03:49.522 TEST_HEADER include/spdk/nvmf_spec.h 00:03:49.522 TEST_HEADER include/spdk/pipe.h 00:03:49.522 TEST_HEADER include/spdk/pci_ids.h 00:03:49.522 TEST_HEADER include/spdk/queue.h 00:03:49.522 TEST_HEADER include/spdk/rpc.h 00:03:49.522 TEST_HEADER include/spdk/reduce.h 00:03:49.522 TEST_HEADER include/spdk/scheduler.h 00:03:49.522 TEST_HEADER include/spdk/scsi.h 00:03:49.522 TEST_HEADER include/spdk/sock.h 00:03:49.522 TEST_HEADER include/spdk/string.h 00:03:49.522 TEST_HEADER include/spdk/scsi_spec.h 00:03:49.522 TEST_HEADER include/spdk/thread.h 00:03:49.522 TEST_HEADER include/spdk/trace_parser.h 00:03:49.522 TEST_HEADER include/spdk/trace.h 00:03:49.522 TEST_HEADER include/spdk/stdinc.h 00:03:49.522 TEST_HEADER include/spdk/tree.h 00:03:49.522 TEST_HEADER include/spdk/util.h 00:03:49.522 TEST_HEADER include/spdk/ublk.h 00:03:49.522 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:49.522 TEST_HEADER include/spdk/version.h 00:03:49.522 TEST_HEADER include/spdk/vhost.h 00:03:49.522 TEST_HEADER include/spdk/uuid.h 00:03:49.522 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:49.522 TEST_HEADER include/spdk/zipf.h 00:03:49.522 TEST_HEADER include/spdk/vmd.h 00:03:49.522 TEST_HEADER include/spdk/xor.h 00:03:49.522 CC 
app/iscsi_tgt/iscsi_tgt.o 00:03:49.522 CXX test/cpp_headers/accel_module.o 00:03:49.522 CXX test/cpp_headers/accel.o 00:03:49.522 CXX test/cpp_headers/assert.o 00:03:49.522 CXX test/cpp_headers/barrier.o 00:03:49.522 CXX test/cpp_headers/bdev.o 00:03:49.522 CXX test/cpp_headers/base64.o 00:03:49.522 CXX test/cpp_headers/bdev_module.o 00:03:49.522 CXX test/cpp_headers/bit_array.o 00:03:49.522 CXX test/cpp_headers/bdev_zone.o 00:03:49.522 CXX test/cpp_headers/bit_pool.o 00:03:49.522 CC app/nvmf_tgt/nvmf_main.o 00:03:49.522 CXX test/cpp_headers/blob_bdev.o 00:03:49.522 CXX test/cpp_headers/blobfs.o 00:03:49.522 CXX test/cpp_headers/blob.o 00:03:49.522 CXX test/cpp_headers/blobfs_bdev.o 00:03:49.522 CXX test/cpp_headers/crc16.o 00:03:49.522 CXX test/cpp_headers/config.o 00:03:49.522 CXX test/cpp_headers/conf.o 00:03:49.522 CXX test/cpp_headers/cpuset.o 00:03:49.522 CXX test/cpp_headers/crc32.o 00:03:49.522 CXX test/cpp_headers/dif.o 00:03:49.522 CXX test/cpp_headers/dma.o 00:03:49.522 CXX test/cpp_headers/crc64.o 00:03:49.522 CXX test/cpp_headers/env_dpdk.o 00:03:49.522 CXX test/cpp_headers/endian.o 00:03:49.522 CXX test/cpp_headers/env.o 00:03:49.522 CXX test/cpp_headers/event.o 00:03:49.522 CXX test/cpp_headers/fd.o 00:03:49.522 CXX test/cpp_headers/fd_group.o 00:03:49.522 CXX test/cpp_headers/file.o 00:03:49.522 CXX test/cpp_headers/fsdev.o 00:03:49.522 CXX test/cpp_headers/fsdev_module.o 00:03:49.522 CXX test/cpp_headers/ftl.o 00:03:49.522 CXX test/cpp_headers/hexlify.o 00:03:49.522 CXX test/cpp_headers/fuse_dispatcher.o 00:03:49.522 CXX test/cpp_headers/gpt_spec.o 00:03:49.522 CXX test/cpp_headers/idxd.o 00:03:49.522 CXX test/cpp_headers/histogram_data.o 00:03:49.522 CXX test/cpp_headers/idxd_spec.o 00:03:49.522 CXX test/cpp_headers/init.o 00:03:49.522 CXX test/cpp_headers/ioat.o 00:03:49.522 CXX test/cpp_headers/ioat_spec.o 00:03:49.522 CC app/spdk_tgt/spdk_tgt.o 00:03:49.522 CXX test/cpp_headers/iscsi_spec.o 00:03:49.522 CXX test/cpp_headers/json.o 00:03:49.522 CXX test/cpp_headers/jsonrpc.o 00:03:49.522 CXX test/cpp_headers/keyring.o 00:03:49.522 CXX test/cpp_headers/keyring_module.o 00:03:49.522 CXX test/cpp_headers/likely.o 00:03:49.522 CC test/env/vtophys/vtophys.o 00:03:49.522 CXX test/cpp_headers/log.o 00:03:49.522 CXX test/cpp_headers/lvol.o 00:03:49.522 CXX test/cpp_headers/md5.o 00:03:49.522 CXX test/cpp_headers/memory.o 00:03:49.522 CXX test/cpp_headers/nbd.o 00:03:49.522 CXX test/cpp_headers/mmio.o 00:03:49.522 CXX test/cpp_headers/net.o 00:03:49.522 CXX test/cpp_headers/notify.o 00:03:49.522 CXX test/cpp_headers/nvme.o 00:03:49.522 CXX test/cpp_headers/nvme_intel.o 00:03:49.522 CXX test/cpp_headers/nvme_ocssd.o 00:03:49.522 CC test/thread/lock/spdk_lock.o 00:03:49.522 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:49.522 CC test/thread/poller_perf/poller_perf.o 00:03:49.522 CXX test/cpp_headers/nvme_spec.o 00:03:49.522 CXX test/cpp_headers/nvme_zns.o 00:03:49.522 CC examples/util/zipf/zipf.o 00:03:49.522 CC test/env/memory/memory_ut.o 00:03:49.522 CXX test/cpp_headers/nvmf_cmd.o 00:03:49.522 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:49.522 CXX test/cpp_headers/nvmf.o 00:03:49.522 CXX test/cpp_headers/nvmf_spec.o 00:03:49.522 CXX test/cpp_headers/nvmf_transport.o 00:03:49.522 CC test/env/pci/pci_ut.o 00:03:49.522 CXX test/cpp_headers/opal.o 00:03:49.522 CXX test/cpp_headers/opal_spec.o 00:03:49.522 CXX test/cpp_headers/pci_ids.o 00:03:49.522 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:49.522 CXX test/cpp_headers/pipe.o 00:03:49.522 CC test/app/jsoncat/jsoncat.o 
00:03:49.522 CXX test/cpp_headers/queue.o 00:03:49.522 CC test/app/histogram_perf/histogram_perf.o 00:03:49.522 CXX test/cpp_headers/reduce.o 00:03:49.522 CXX test/cpp_headers/rpc.o 00:03:49.522 CXX test/cpp_headers/scheduler.o 00:03:49.522 CXX test/cpp_headers/scsi.o 00:03:49.522 CXX test/cpp_headers/scsi_spec.o 00:03:49.522 CXX test/cpp_headers/sock.o 00:03:49.522 CC examples/ioat/verify/verify.o 00:03:49.522 CXX test/cpp_headers/stdinc.o 00:03:49.522 CXX test/cpp_headers/string.o 00:03:49.522 CXX test/cpp_headers/thread.o 00:03:49.522 CC examples/ioat/perf/perf.o 00:03:49.522 CXX test/cpp_headers/trace.o 00:03:49.522 CC test/app/stub/stub.o 00:03:49.522 CXX test/cpp_headers/trace_parser.o 00:03:49.522 LINK rpc_client_test 00:03:49.522 CC test/dma/test_dma/test_dma.o 00:03:49.522 CC app/fio/nvme/fio_plugin.o 00:03:49.522 CXX test/cpp_headers/tree.o 00:03:49.522 LINK spdk_lspci 00:03:49.522 CC test/app/bdev_svc/bdev_svc.o 00:03:49.522 CC test/env/mem_callbacks/mem_callbacks.o 00:03:49.522 LINK spdk_nvme_discover 00:03:49.522 CXX test/cpp_headers/ublk.o 00:03:49.522 LINK spdk_trace_record 00:03:49.522 LINK interrupt_tgt 00:03:49.522 CC app/fio/bdev/fio_plugin.o 00:03:49.522 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:49.781 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:49.781 CXX test/cpp_headers/util.o 00:03:49.781 CXX test/cpp_headers/version.o 00:03:49.781 CXX test/cpp_headers/uuid.o 00:03:49.781 CXX test/cpp_headers/vfio_user_pci.o 00:03:49.781 CXX test/cpp_headers/vfio_user_spec.o 00:03:49.781 LINK vtophys 00:03:49.781 LINK jsoncat 00:03:49.781 CXX test/cpp_headers/vhost.o 00:03:49.781 CXX test/cpp_headers/vmd.o 00:03:49.781 CXX test/cpp_headers/xor.o 00:03:49.781 CXX test/cpp_headers/zipf.o 00:03:49.781 LINK zipf 00:03:49.781 LINK poller_perf 00:03:49.781 LINK histogram_perf 00:03:49.781 LINK env_dpdk_post_init 00:03:49.781 LINK iscsi_tgt 00:03:49.781 LINK nvmf_tgt 00:03:49.781 LINK stub 00:03:49.781 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:49.781 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:49.781 LINK verify 00:03:49.781 LINK spdk_trace 00:03:49.781 LINK ioat_perf 00:03:49.781 LINK spdk_tgt 00:03:49.781 LINK bdev_svc 00:03:49.781 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:49.781 LINK mem_callbacks 00:03:49.781 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:50.040 LINK spdk_dd 00:03:50.040 LINK nvme_fuzz 00:03:50.040 LINK test_dma 00:03:50.040 LINK pci_ut 00:03:50.040 LINK spdk_nvme 00:03:50.040 LINK spdk_nvme_identify 00:03:50.040 LINK llvm_vfio_fuzz 00:03:50.040 LINK spdk_nvme_perf 00:03:50.040 LINK vhost_fuzz 00:03:50.040 LINK spdk_bdev 00:03:50.040 LINK spdk_top 00:03:50.299 LINK memory_ut 00:03:50.299 LINK llvm_nvme_fuzz 00:03:50.299 CC examples/sock/hello_world/hello_sock.o 00:03:50.299 CC app/vhost/vhost.o 00:03:50.299 CC examples/idxd/perf/perf.o 00:03:50.299 CC examples/vmd/led/led.o 00:03:50.299 CC examples/vmd/lsvmd/lsvmd.o 00:03:50.299 CC examples/thread/thread/thread_ex.o 00:03:50.299 LINK lsvmd 00:03:50.558 LINK led 00:03:50.558 LINK hello_sock 00:03:50.558 LINK vhost 00:03:50.558 LINK idxd_perf 00:03:50.558 LINK spdk_lock 00:03:50.558 LINK thread 00:03:50.816 LINK iscsi_fuzz 00:03:51.076 CC examples/nvme/abort/abort.o 00:03:51.076 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:51.076 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:51.076 CC examples/nvme/hello_world/hello_world.o 00:03:51.076 CC examples/nvme/hotplug/hotplug.o 00:03:51.076 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:51.076 CC 
examples/nvme/arbitration/arbitration.o 00:03:51.076 CC examples/nvme/reconnect/reconnect.o 00:03:51.076 CC test/event/event_perf/event_perf.o 00:03:51.076 CC test/event/reactor/reactor.o 00:03:51.076 CC test/event/reactor_perf/reactor_perf.o 00:03:51.336 CC test/event/app_repeat/app_repeat.o 00:03:51.336 CC test/event/scheduler/scheduler.o 00:03:51.336 LINK pmr_persistence 00:03:51.336 LINK event_perf 00:03:51.336 LINK cmb_copy 00:03:51.336 LINK reactor_perf 00:03:51.336 LINK reactor 00:03:51.336 LINK hello_world 00:03:51.336 LINK hotplug 00:03:51.336 LINK app_repeat 00:03:51.336 LINK abort 00:03:51.336 LINK reconnect 00:03:51.336 LINK arbitration 00:03:51.336 LINK nvme_manage 00:03:51.336 LINK scheduler 00:03:51.595 CC test/nvme/overhead/overhead.o 00:03:51.595 CC test/nvme/startup/startup.o 00:03:51.595 CC test/nvme/boot_partition/boot_partition.o 00:03:51.595 CC test/nvme/connect_stress/connect_stress.o 00:03:51.595 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:51.595 CC test/nvme/simple_copy/simple_copy.o 00:03:51.595 CC test/nvme/aer/aer.o 00:03:51.595 CC test/nvme/reset/reset.o 00:03:51.595 CC test/nvme/sgl/sgl.o 00:03:51.595 CC test/nvme/e2edp/nvme_dp.o 00:03:51.595 CC test/nvme/reserve/reserve.o 00:03:51.595 CC test/nvme/err_injection/err_injection.o 00:03:51.595 CC test/nvme/compliance/nvme_compliance.o 00:03:51.595 CC test/nvme/fused_ordering/fused_ordering.o 00:03:51.595 CC test/nvme/cuse/cuse.o 00:03:51.595 CC test/nvme/fdp/fdp.o 00:03:51.595 CC test/blobfs/mkfs/mkfs.o 00:03:51.595 CC test/accel/dif/dif.o 00:03:51.854 CC test/lvol/esnap/esnap.o 00:03:51.854 LINK connect_stress 00:03:51.854 LINK boot_partition 00:03:51.854 LINK startup 00:03:51.854 LINK err_injection 00:03:51.854 LINK doorbell_aers 00:03:51.854 LINK reserve 00:03:51.854 LINK simple_copy 00:03:51.854 LINK fused_ordering 00:03:51.854 LINK overhead 00:03:51.854 LINK aer 00:03:51.854 LINK sgl 00:03:51.854 LINK reset 00:03:51.854 LINK nvme_dp 00:03:51.854 LINK mkfs 00:03:51.854 LINK fdp 00:03:51.854 LINK nvme_compliance 00:03:52.113 LINK dif 00:03:52.373 CC examples/accel/perf/accel_perf.o 00:03:52.373 CC examples/fsdev/hello_world/hello_fsdev.o 00:03:52.373 CC examples/blob/cli/blobcli.o 00:03:52.373 CC examples/blob/hello_world/hello_blob.o 00:03:52.373 LINK hello_fsdev 00:03:52.373 LINK cuse 00:03:52.373 LINK hello_blob 00:03:52.632 LINK accel_perf 00:03:52.632 LINK blobcli 00:03:53.201 CC examples/bdev/bdevperf/bdevperf.o 00:03:53.201 CC examples/bdev/hello_world/hello_bdev.o 00:03:53.460 LINK hello_bdev 00:03:53.718 CC test/bdev/bdevio/bdevio.o 00:03:53.718 LINK bdevperf 00:03:53.977 LINK bdevio 00:03:54.913 LINK esnap 00:03:55.172 CC examples/nvmf/nvmf/nvmf.o 00:03:55.431 LINK nvmf 00:03:56.810 00:03:56.810 real 0m35.785s 00:03:56.810 user 4m36.941s 00:03:56.810 sys 1m40.796s 00:03:56.810 16:27:54 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:03:56.810 16:27:54 make -- common/autotest_common.sh@10 -- $ set +x 00:03:56.810 ************************************ 00:03:56.810 END TEST make 00:03:56.810 ************************************ 00:03:56.810 16:27:54 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:56.810 16:27:54 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:56.810 16:27:54 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:56.810 16:27:54 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:56.810 16:27:54 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:56.810 
16:27:54 -- pm/common@44 -- $ pid=3613841 00:03:56.810 16:27:54 -- pm/common@50 -- $ kill -TERM 3613841 00:03:56.810 16:27:54 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:56.810 16:27:54 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:56.810 16:27:54 -- pm/common@44 -- $ pid=3613843 00:03:56.810 16:27:54 -- pm/common@50 -- $ kill -TERM 3613843 00:03:56.810 16:27:54 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:56.810 16:27:54 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:56.810 16:27:54 -- pm/common@44 -- $ pid=3613845 00:03:56.810 16:27:54 -- pm/common@50 -- $ kill -TERM 3613845 00:03:56.810 16:27:54 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:56.810 16:27:54 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:56.810 16:27:54 -- pm/common@44 -- $ pid=3613872 00:03:56.810 16:27:54 -- pm/common@50 -- $ sudo -E kill -TERM 3613872 00:03:56.810 16:27:54 -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:03:56.810 16:27:54 -- common/autotest_common.sh@1681 -- # lcov --version 00:03:56.810 16:27:54 -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:03:56.810 16:27:54 -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:03:56.810 16:27:54 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:03:56.810 16:27:54 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:03:56.810 16:27:54 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:03:56.810 16:27:54 -- scripts/common.sh@336 -- # IFS=.-: 00:03:56.810 16:27:54 -- scripts/common.sh@336 -- # read -ra ver1 00:03:56.810 16:27:54 -- scripts/common.sh@337 -- # IFS=.-: 00:03:56.810 16:27:54 -- scripts/common.sh@337 -- # read -ra ver2 00:03:56.810 16:27:54 -- scripts/common.sh@338 -- # local 'op=<' 00:03:56.810 16:27:54 -- scripts/common.sh@340 -- # ver1_l=2 00:03:56.810 16:27:54 -- scripts/common.sh@341 -- # ver2_l=1 00:03:56.810 16:27:54 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:03:56.810 16:27:54 -- scripts/common.sh@344 -- # case "$op" in 00:03:56.810 16:27:54 -- scripts/common.sh@345 -- # : 1 00:03:56.810 16:27:54 -- scripts/common.sh@364 -- # (( v = 0 )) 00:03:56.810 16:27:54 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:56.810 16:27:54 -- scripts/common.sh@365 -- # decimal 1 00:03:56.810 16:27:54 -- scripts/common.sh@353 -- # local d=1 00:03:56.810 16:27:54 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:56.810 16:27:54 -- scripts/common.sh@355 -- # echo 1 00:03:56.810 16:27:54 -- scripts/common.sh@365 -- # ver1[v]=1 00:03:56.810 16:27:54 -- scripts/common.sh@366 -- # decimal 2 00:03:56.810 16:27:54 -- scripts/common.sh@353 -- # local d=2 00:03:56.810 16:27:54 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:56.810 16:27:54 -- scripts/common.sh@355 -- # echo 2 00:03:56.810 16:27:54 -- scripts/common.sh@366 -- # ver2[v]=2 00:03:56.810 16:27:54 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:03:56.810 16:27:54 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:03:56.810 16:27:54 -- scripts/common.sh@368 -- # return 0 00:03:56.810 16:27:54 -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:56.810 16:27:54 -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:03:56.810 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:56.810 --rc genhtml_branch_coverage=1 00:03:56.810 --rc genhtml_function_coverage=1 00:03:56.810 --rc genhtml_legend=1 00:03:56.810 --rc geninfo_all_blocks=1 00:03:56.810 --rc geninfo_unexecuted_blocks=1 00:03:56.810 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:56.810 ' 00:03:56.810 16:27:54 -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:03:56.810 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:56.810 --rc genhtml_branch_coverage=1 00:03:56.810 --rc genhtml_function_coverage=1 00:03:56.810 --rc genhtml_legend=1 00:03:56.810 --rc geninfo_all_blocks=1 00:03:56.810 --rc geninfo_unexecuted_blocks=1 00:03:56.810 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:56.810 ' 00:03:56.810 16:27:54 -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:03:56.810 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:56.810 --rc genhtml_branch_coverage=1 00:03:56.810 --rc genhtml_function_coverage=1 00:03:56.810 --rc genhtml_legend=1 00:03:56.810 --rc geninfo_all_blocks=1 00:03:56.810 --rc geninfo_unexecuted_blocks=1 00:03:56.810 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:56.810 ' 00:03:56.810 16:27:54 -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:03:56.810 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:56.810 --rc genhtml_branch_coverage=1 00:03:56.810 --rc genhtml_function_coverage=1 00:03:56.811 --rc genhtml_legend=1 00:03:56.811 --rc geninfo_all_blocks=1 00:03:56.811 --rc geninfo_unexecuted_blocks=1 00:03:56.811 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:56.811 ' 00:03:56.811 16:27:54 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:03:56.811 16:27:54 -- nvmf/common.sh@7 -- # uname -s 00:03:56.811 16:27:54 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:56.811 16:27:54 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:56.811 16:27:54 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:56.811 16:27:54 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:56.811 16:27:54 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:56.811 16:27:54 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:56.811 16:27:54 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:56.811 16:27:54 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:56.811 16:27:54 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:56.811 16:27:54 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:56.811 16:27:54 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:03:56.811 16:27:54 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:03:56.811 16:27:54 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:56.811 16:27:54 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:56.811 16:27:54 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:56.811 16:27:54 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:56.811 16:27:54 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:03:56.811 16:27:54 -- scripts/common.sh@15 -- # shopt -s extglob 00:03:56.811 16:27:54 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:56.811 16:27:54 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:56.811 16:27:54 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:56.811 16:27:54 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:56.811 16:27:54 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:56.811 16:27:54 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:56.811 16:27:54 -- paths/export.sh@5 -- # export PATH 00:03:56.811 16:27:54 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:56.811 16:27:54 -- nvmf/common.sh@51 -- # : 0 00:03:56.811 16:27:54 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:03:56.811 16:27:54 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:03:56.811 16:27:54 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:56.811 16:27:54 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:56.811 16:27:54 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:56.811 16:27:54 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:03:56.811 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:03:56.811 16:27:54 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:03:56.811 16:27:54 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:03:56.811 16:27:54 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:03:56.811 16:27:54 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:56.811 16:27:54 -- spdk/autotest.sh@32 -- # uname -s 00:03:56.811 
16:27:54 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:56.811 16:27:54 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:56.811 16:27:54 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:56.811 16:27:54 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:56.811 16:27:54 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:03:56.811 16:27:54 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:56.811 16:27:54 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:56.811 16:27:54 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:56.811 16:27:54 -- spdk/autotest.sh@48 -- # udevadm_pid=3692168 00:03:56.811 16:27:54 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:56.811 16:27:54 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:56.811 16:27:54 -- pm/common@17 -- # local monitor 00:03:56.811 16:27:54 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:56.811 16:27:54 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:56.811 16:27:54 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:56.811 16:27:54 -- pm/common@21 -- # date +%s 00:03:56.811 16:27:54 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:56.811 16:27:54 -- pm/common@21 -- # date +%s 00:03:56.811 16:27:54 -- pm/common@25 -- # sleep 1 00:03:56.811 16:27:54 -- pm/common@21 -- # date +%s 00:03:56.811 16:27:54 -- pm/common@21 -- # date +%s 00:03:56.811 16:27:54 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732807674 00:03:56.811 16:27:54 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732807674 00:03:56.811 16:27:54 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732807674 00:03:56.811 16:27:54 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732807674 00:03:57.070 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732807674_collect-cpu-load.pm.log 00:03:57.070 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732807674_collect-vmstat.pm.log 00:03:57.070 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732807674_collect-cpu-temp.pm.log 00:03:57.070 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732807674_collect-bmc-pm.bmc.pm.log 00:03:58.006 16:27:55 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:58.006 16:27:55 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:58.006 16:27:55 -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:58.006 16:27:55 -- common/autotest_common.sh@10 -- # set +x 
00:03:58.006 16:27:55 -- spdk/autotest.sh@59 -- # create_test_list 00:03:58.006 16:27:55 -- common/autotest_common.sh@748 -- # xtrace_disable 00:03:58.006 16:27:55 -- common/autotest_common.sh@10 -- # set +x 00:03:58.006 16:27:55 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:03:58.006 16:27:55 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:58.006 16:27:55 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:58.006 16:27:55 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:03:58.006 16:27:55 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:03:58.006 16:27:55 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:58.006 16:27:55 -- common/autotest_common.sh@1455 -- # uname 00:03:58.006 16:27:55 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:58.006 16:27:55 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:58.006 16:27:55 -- common/autotest_common.sh@1475 -- # uname 00:03:58.006 16:27:55 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:58.006 16:27:55 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:03:58.006 16:27:55 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:03:58.006 lcov: LCOV version 1.15 00:03:58.007 16:27:55 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:03.281 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:08.555 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:13.835 16:28:11 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:13.835 16:28:11 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:13.835 16:28:11 -- common/autotest_common.sh@10 -- # set +x 00:04:13.835 16:28:11 -- spdk/autotest.sh@78 -- # rm -f 00:04:13.835 16:28:11 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:17.125 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:17.125 
0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:17.125 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:04:17.125 16:28:14 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:17.125 16:28:14 -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:17.125 16:28:14 -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:17.125 16:28:14 -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:17.125 16:28:14 -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:17.125 16:28:14 -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:17.126 16:28:14 -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:17.126 16:28:14 -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:17.126 16:28:14 -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:17.126 16:28:14 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:17.126 16:28:14 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:17.126 16:28:14 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:17.126 16:28:14 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:17.126 16:28:14 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:17.126 16:28:14 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:17.126 No valid GPT data, bailing 00:04:17.126 16:28:14 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:17.126 16:28:14 -- scripts/common.sh@394 -- # pt= 00:04:17.126 16:28:14 -- scripts/common.sh@395 -- # return 1 00:04:17.126 16:28:14 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:17.126 1+0 records in 00:04:17.126 1+0 records out 00:04:17.126 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0050034 s, 210 MB/s 00:04:17.126 16:28:14 -- spdk/autotest.sh@105 -- # sync 00:04:17.126 16:28:14 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:17.126 16:28:14 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:17.126 16:28:14 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:25.353 16:28:21 -- spdk/autotest.sh@111 -- # uname -s 00:04:25.354 16:28:21 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:25.354 16:28:21 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:25.354 16:28:21 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:25.354 16:28:21 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:25.354 16:28:21 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:25.354 16:28:21 -- common/autotest_common.sh@10 -- # set +x 00:04:25.354 ************************************ 00:04:25.354 START TEST setup.sh 00:04:25.354 ************************************ 00:04:25.354 16:28:21 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:25.354 * Looking for test storage... 
00:04:25.354 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1681 -- # lcov --version 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:25.354 16:28:22 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:25.354 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:25.354 --rc genhtml_branch_coverage=1 00:04:25.354 --rc genhtml_function_coverage=1 00:04:25.354 --rc genhtml_legend=1 00:04:25.354 --rc geninfo_all_blocks=1 00:04:25.354 --rc geninfo_unexecuted_blocks=1 00:04:25.354 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:25.354 ' 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:25.354 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:25.354 --rc genhtml_branch_coverage=1 00:04:25.354 --rc genhtml_function_coverage=1 00:04:25.354 --rc genhtml_legend=1 00:04:25.354 --rc geninfo_all_blocks=1 00:04:25.354 --rc geninfo_unexecuted_blocks=1 
00:04:25.354 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:25.354 ' 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:25.354 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:25.354 --rc genhtml_branch_coverage=1 00:04:25.354 --rc genhtml_function_coverage=1 00:04:25.354 --rc genhtml_legend=1 00:04:25.354 --rc geninfo_all_blocks=1 00:04:25.354 --rc geninfo_unexecuted_blocks=1 00:04:25.354 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:25.354 ' 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:25.354 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:25.354 --rc genhtml_branch_coverage=1 00:04:25.354 --rc genhtml_function_coverage=1 00:04:25.354 --rc genhtml_legend=1 00:04:25.354 --rc geninfo_all_blocks=1 00:04:25.354 --rc geninfo_unexecuted_blocks=1 00:04:25.354 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:25.354 ' 00:04:25.354 16:28:22 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:25.354 16:28:22 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:25.354 16:28:22 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:25.354 16:28:22 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:25.354 ************************************ 00:04:25.354 START TEST acl 00:04:25.354 ************************************ 00:04:25.354 16:28:22 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:25.354 * Looking for test storage... 
00:04:25.354 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:25.354 16:28:22 setup.sh.acl -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:25.354 16:28:22 setup.sh.acl -- common/autotest_common.sh@1681 -- # lcov --version 00:04:25.354 16:28:22 setup.sh.acl -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:25.354 16:28:22 setup.sh.acl -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:25.354 16:28:22 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:25.355 16:28:22 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:25.355 16:28:22 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:25.355 16:28:22 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:25.355 16:28:22 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:25.355 16:28:22 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:25.355 16:28:22 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:25.355 16:28:22 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:25.355 16:28:22 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:25.355 16:28:22 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:25.355 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:25.355 --rc genhtml_branch_coverage=1 00:04:25.355 --rc genhtml_function_coverage=1 00:04:25.355 --rc genhtml_legend=1 00:04:25.355 --rc geninfo_all_blocks=1 00:04:25.355 --rc geninfo_unexecuted_blocks=1 00:04:25.355 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:25.355 ' 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:25.355 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:25.355 --rc genhtml_branch_coverage=1 00:04:25.355 --rc 
genhtml_function_coverage=1 00:04:25.355 --rc genhtml_legend=1 00:04:25.355 --rc geninfo_all_blocks=1 00:04:25.355 --rc geninfo_unexecuted_blocks=1 00:04:25.355 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:25.355 ' 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:25.355 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:25.355 --rc genhtml_branch_coverage=1 00:04:25.355 --rc genhtml_function_coverage=1 00:04:25.355 --rc genhtml_legend=1 00:04:25.355 --rc geninfo_all_blocks=1 00:04:25.355 --rc geninfo_unexecuted_blocks=1 00:04:25.355 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:25.355 ' 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:25.355 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:25.355 --rc genhtml_branch_coverage=1 00:04:25.355 --rc genhtml_function_coverage=1 00:04:25.355 --rc genhtml_legend=1 00:04:25.355 --rc geninfo_all_blocks=1 00:04:25.355 --rc geninfo_unexecuted_blocks=1 00:04:25.355 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:25.355 ' 00:04:25.355 16:28:22 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1656 -- # local nvme bdf 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:25.355 16:28:22 setup.sh.acl -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:04:25.355 16:28:22 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:25.355 16:28:22 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:25.355 16:28:22 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:25.355 16:28:22 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:25.355 16:28:22 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:25.355 16:28:22 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:25.355 16:28:22 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:28.647 16:28:26 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:28.647 16:28:26 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:28.647 16:28:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:28.647 16:28:26 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:28.647 16:28:26 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:28.647 16:28:26 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:31.939 Hugepages 00:04:31.939 node hugesize free / total 00:04:32.198 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:32.198 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:32.198 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.198 16:28:29 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:32.198 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 00:04:32.199 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:32.199 16:28:29 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:32.199 16:28:29 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:32.199 16:28:29 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:32.199 16:28:29 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:32.459 ************************************ 00:04:32.459 START TEST denied 00:04:32.459 ************************************ 00:04:32.459 16:28:29 setup.sh.acl.denied -- 
common/autotest_common.sh@1125 -- # denied 00:04:32.459 16:28:29 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:04:32.459 16:28:29 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:04:32.459 16:28:29 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:32.459 16:28:29 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:32.459 16:28:29 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:35.752 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:35.752 16:28:33 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:39.956 00:04:39.956 real 0m7.664s 00:04:39.956 user 0m2.381s 00:04:39.956 sys 0m4.603s 00:04:39.956 16:28:37 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:39.956 16:28:37 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:39.956 ************************************ 00:04:39.956 END TEST denied 00:04:39.956 ************************************ 00:04:39.956 16:28:37 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:39.956 16:28:37 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:39.956 16:28:37 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:39.956 16:28:37 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:40.216 ************************************ 00:04:40.216 START TEST allowed 00:04:40.216 ************************************ 00:04:40.216 16:28:37 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed 00:04:40.216 16:28:37 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:04:40.216 16:28:37 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:40.216 16:28:37 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:04:40.216 16:28:37 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:40.216 16:28:37 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:45.493 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:45.493 16:28:42 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:45.493 16:28:42 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:45.493 16:28:42 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:45.493 16:28:42 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:45.493 16:28:42 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:48.786 00:04:48.786 real 0m8.782s 00:04:48.786 user 0m2.425s 00:04:48.786 sys 0m4.868s 00:04:48.786 16:28:46 setup.sh.acl.allowed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:48.786 16:28:46 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:48.786 ************************************ 00:04:48.786 END TEST allowed 00:04:48.786 ************************************ 00:04:49.046 00:04:49.046 real 0m24.249s 00:04:49.046 user 0m7.732s 00:04:49.046 sys 0m14.648s 00:04:49.046 16:28:46 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:49.046 16:28:46 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:49.046 ************************************ 00:04:49.046 END TEST acl 00:04:49.046 ************************************ 00:04:49.046 16:28:46 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:49.046 16:28:46 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:49.046 16:28:46 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:49.046 16:28:46 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:49.046 ************************************ 00:04:49.046 START TEST hugepages 00:04:49.046 ************************************ 00:04:49.046 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:49.046 * Looking for test storage... 00:04:49.046 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:49.046 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:04:49.046 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # lcov --version 00:04:49.046 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:04:49.046 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:49.046 16:28:46 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:49.307 16:28:46 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:04:49.307 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:49.307 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:04:49.307 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:49.307 --rc genhtml_branch_coverage=1 00:04:49.307 --rc genhtml_function_coverage=1 00:04:49.307 --rc genhtml_legend=1 00:04:49.307 --rc geninfo_all_blocks=1 00:04:49.307 --rc geninfo_unexecuted_blocks=1 00:04:49.307 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:49.307 ' 00:04:49.307 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:04:49.307 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:49.307 --rc genhtml_branch_coverage=1 00:04:49.307 --rc genhtml_function_coverage=1 00:04:49.307 --rc genhtml_legend=1 00:04:49.307 --rc geninfo_all_blocks=1 00:04:49.307 --rc geninfo_unexecuted_blocks=1 00:04:49.307 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:49.307 ' 00:04:49.307 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:04:49.307 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:49.307 --rc genhtml_branch_coverage=1 00:04:49.307 --rc genhtml_function_coverage=1 00:04:49.307 --rc genhtml_legend=1 00:04:49.307 --rc geninfo_all_blocks=1 00:04:49.307 --rc geninfo_unexecuted_blocks=1 00:04:49.307 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:49.307 ' 00:04:49.307 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:04:49.307 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:49.307 --rc genhtml_branch_coverage=1 00:04:49.307 --rc genhtml_function_coverage=1 00:04:49.307 --rc genhtml_legend=1 00:04:49.307 --rc geninfo_all_blocks=1 00:04:49.307 --rc geninfo_unexecuted_blocks=1 00:04:49.307 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:49.307 ' 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:49.307 16:28:46 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 39712724 kB' 'MemAvailable: 41339400 kB' 'Buffers: 6784 kB' 'Cached: 10907896 kB' 'SwapCached: 76 kB' 'Active: 8331496 kB' 'Inactive: 3175828 kB' 'Active(anon): 7424168 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 595840 kB' 'Mapped: 154556 kB' 'Shmem: 9166708 kB' 'KReclaimable: 575504 kB' 'Slab: 1580020 kB' 'SReclaimable: 575504 kB' 'SUnreclaim: 1004516 kB' 'KernelStack: 21840 kB' 'PageTables: 8676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 11694352 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217844 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # 
IFS=': ' 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.307 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 
16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- 
# read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
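Aside: the long run of [[ <field> == \H\u\g\e\p\a\g\e\s\i\z\e ]] / continue pairs in this stretch is one xtrace entry per /proc/meminfo field that get_meminfo skips on its way to Hugepagesize. A minimal sketch of that lookup, reconstructed from the trace (illustrative; the real helper is the get_meminfo in test/setup/common.sh):

    #!/usr/bin/env bash
    # Sketch of the meminfo field lookup being traced here.
    shopt -s extglob   # needed for the "Node N " prefix strip below

    get_meminfo() {
        local get=$1 node=${2:-}
        local mem_f=/proc/meminfo
        # Per-node queries read the node's own meminfo if it exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        local -a mem
        mapfile -t mem <"$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node files prefix each line with "Node N "
        local line var val _
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            [[ $var == "$get" ]] || continue   # one [[ ]]/continue pair per skipped field
            echo "$val"                        # numeric value; a trailing kB unit lands in $_
            return 0
        done
        return 1
    }

    get_meminfo Hugepagesize        # -> 2048 on this runner
    get_meminfo HugePages_Total 0   # node-0 value, where per-node meminfo exists

The scan is linear, which is why the log shows a skip for every field from MemTotal down until Hugepagesize finally matches.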
00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.308 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 
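Aside: once the scan just below matches Hugepagesize and echoes 2048, hugepages.sh derives default_hugepages=2048 and resets every node's hugepage pools (the clear_hp / echo 0 entries a little further down) before running the tests. A rough sketch of that reset, assuming the usual sysfs layout and root privileges (illustrative, not a verbatim copy of test/setup/hugepages.sh):

    #!/usr/bin/env bash
    # Sketch of the per-node hugepage reset traced below. Needs root.
    shopt -s nullglob   # skip cleanly if a glob matches nothing

    clear_hp() {
        local node hp
        for node in /sys/devices/system/node/node[0-9]*; do
            for hp in "$node"/hugepages/hugepages-*; do
                echo 0 >"$hp/nr_hugepages"   # drop this node's pool for each page size
            done
        done
        export CLEAR_HUGE=yes
    }

    # A single-node test (NRHUGE=1024 HUGENODE=0, as in this run) then
    # repopulates only node 0, roughly:
    #   echo 1024 > /sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages

This reset-then-repopulate pattern is what the single_node_setup test further down exercises on this two-node machine.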
00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@16 -- # 
default_hugepages=2048 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:04:49.309 16:28:46 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:04:49.309 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:49.309 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:49.309 16:28:46 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:49.309 ************************************ 00:04:49.309 START TEST single_node_setup 00:04:49.309 ************************************ 00:04:49.309 16:28:46 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1125 -- # single_node_setup 00:04:49.309 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:04:49.309 16:28:46 
setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@48 -- # local size=2097152 00:04:49.309 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:04:49.309 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift 00:04:49.309 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0') 00:04:49.309 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids 00:04:49.309 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:49.309 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:04:49.309 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:49.310 16:28:46 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:52.602 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:52.602 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:53.987 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup 
-- setup/hugepages.sh@137 -- # verify_nr_hugepages 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.987 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41930708 kB' 'MemAvailable: 43557384 kB' 'Buffers: 6784 kB' 'Cached: 10908044 kB' 'SwapCached: 76 kB' 'Active: 8335668 kB' 'Inactive: 3175828 kB' 'Active(anon): 7428340 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 600396 kB' 'Mapped: 154760 kB' 'Shmem: 9166856 kB' 'KReclaimable: 575504 kB' 'Slab: 1579192 kB' 'SReclaimable: 575504 kB' 'SUnreclaim: 1003688 kB' 'KernelStack: 21888 kB' 'PageTables: 9040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11699152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217892 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.988 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 
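The long run of IFS=': ' / read -r var val _ / continue steps traced above and below is setup/common.sh's get_meminfo walking /proc/meminfo one field at a time, skipping every key until the requested one (here AnonHugePages) matches, at which point it echoes the value and returns. A minimal standalone sketch of that lookup, assuming a simplified helper (hypothetical name get_meminfo_sketch; the real function in setup/common.sh additionally handles per-node files under /sys/devices/system/node/ and strips their "Node <n> " prefixes, the mem=("${mem[@]#Node +([0-9]) }") step visible in the trace):

#!/usr/bin/env bash
# Hypothetical, simplified re-creation of the lookup being traced in
# this log; not the actual setup/common.sh implementation.
get_meminfo_sketch() {
    local get=$1   # field to look up, e.g. AnonHugePages
    local var val _
    # Split each "Key:   value kB" line on ':' and spaces, exactly as
    # the traced IFS=': ' / read -r var val _ loop does.
    while IFS=': ' read -r var val _; do
        # Every non-matching field produces one of the "continue"
        # steps seen in the trace above.
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done </proc/meminfo
    return 1   # field not present
}

get_meminfo_sketch AnonHugePages   # prints the value in kB, e.g. "0"

The match itself, with the echo 0 / return 0 and the hugepages.sh anon=0 assignment it feeds, is visible a few lines below.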
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:04:53.989 16:28:51
setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41938520 kB' 'MemAvailable: 43565132 kB' 'Buffers: 6784 kB' 'Cached: 10908048 kB' 'SwapCached: 76 kB' 'Active: 8335920 kB' 'Inactive: 3175828 kB' 'Active(anon): 7428592 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 600512 kB' 'Mapped: 154624 kB' 'Shmem: 9166860 kB' 'KReclaimable: 575440 kB' 'Slab: 1579096 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003656 kB' 'KernelStack: 21968 kB' 'PageTables: 8772 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11699172 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217924 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.989 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 
-- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 
-- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.990 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local 
get=HugePages_Rsvd 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41940964 kB' 'MemAvailable: 43567576 kB' 'Buffers: 6784 kB' 'Cached: 10908048 kB' 'SwapCached: 76 kB' 'Active: 8336032 kB' 'Inactive: 3175828 kB' 'Active(anon): 7428704 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 600576 kB' 'Mapped: 154624 kB' 'Shmem: 9166860 kB' 'KReclaimable: 575440 kB' 'Slab: 1579092 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003652 kB' 'KernelStack: 21936 kB' 'PageTables: 8452 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11699196 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217908 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 
16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.991 16:28:51 
setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.991 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 
setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:53.992 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:53.992 16:28:51 
[trace condensed: setup/common.sh@31-32 keep splitting each remaining /proc/meminfo line with IFS=': ' and `continue` past every field that is not the requested key (PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free)]
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:04:53.993 nr_hugepages=1024
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:04:53.993 resv_hugepages=0
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:04:53.993 surplus_hugepages=0
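The long continue-chain traced above is the core of the harness's get_meminfo helper: split each meminfo line on ': ' into a key and a value, skip every key that is not the one requested, and echo the value of the first match. A minimal standalone reconstruction of that idiom, built only from what the trace shows (the names get, node, mem_f and mem are the trace's own; the real setup/common.sh adds caching and error handling):

  #!/usr/bin/env bash
  shopt -s extglob  # the +([0-9]) pattern below is an extglob
  get_meminfo() {   # usage: get_meminfo <Field> [<numa-node>]
      local get=$1 node=$2
      local var val _
      local mem_f=/proc/meminfo mem
      # A node argument switches the source to that node's own meminfo file.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")  # per-node files prefix each line with "Node N "
      local line
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue  # this is the continue-chain in the trace
          echo "$val"
          return 0
      done
      return 1
  }
  get_meminfo HugePages_Rsvd  # prints 0 on this box, matching resv=0 above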
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:04:53.993 anon_hugepages=0
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:53.993 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41943320 kB' 'MemAvailable: 43569932 kB' 'Buffers: 6784 kB' 'Cached: 10908084 kB' 'SwapCached: 76 kB' 'Active: 8335636 kB' 'Inactive: 3175828 kB' 'Active(anon): 7428308 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 600048 kB' 'Mapped: 154624 kB' 'Shmem: 9166896 kB' 'KReclaimable: 575440 kB' 'Slab: 1579092 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003652 kB' 'KernelStack: 21888 kB' 'PageTables: 8616 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11699216 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217972 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[trace condensed: setup/common.sh@31-32 scan that snapshot field by field, `continue` past every key that is not HugePages_Total]
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
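Taken together, hugepages.sh@106-109 assert the hugepage accounting identity this test leans on: the kernel's HugePages_Total must equal the requested page count plus surplus and reserved pages. Restated on its own, with get_meminfo as sketched earlier and this run's numbers in the comments:

  nr_hugepages=1024                     # what the test requested
  surp=$(get_meminfo HugePages_Surp)    # 0 in this run
  resv=$(get_meminfo HugePages_Rsvd)    # 0 in this run
  total=$(get_meminfo HugePages_Total)  # 1024 in this run
  (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2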
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:54.258 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22551648 kB' 'MemUsed: 10082788 kB' 'SwapCached: 44 kB' 'Active: 5089240 kB' 'Inactive: 535260 kB' 'Active(anon): 4311680 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5340488 kB' 'Mapped: 76384 kB' 'AnonPages: 287476 kB' 'Shmem: 4027680 kB' 'KernelStack: 10872 kB' 'PageTables: 5120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 400876 kB' 'Slab: 886484 kB' 'SReclaimable: 400876 kB' 'SUnreclaim: 485608 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[trace condensed: setup/common.sh@31-32 scan the node0 snapshot field by field (MemTotal through HugePages_Total), `continue` past every key that is not HugePages_Surp]
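A per-node lookup such as this HugePages_Surp query differs from a global one in two ways, both visible in the trace: common.sh@23-24 switch mem_f to /sys/devices/system/node/node0/meminfo, and common.sh@29 strips the "Node 0 " prefix that every line of a per-node meminfo file carries, so the same field scan can then run unchanged. A small demonstration of the stripping step on a box like this one:

  shopt -s extglob                  # +([0-9]) is an extglob pattern
  mapfile -t mem < /sys/devices/system/node/node0/meminfo
  echo "${mem[0]}"                  # e.g. "Node 0 MemTotal: 32634436 kB"
  mem=("${mem[@]#Node +([0-9]) }")  # the same expansion as common.sh@29
  echo "${mem[0]}"                  # "MemTotal: 32634436 kB" - now shaped like /proc/meminfo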
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:04:54.260 node0=1024 expecting 1024
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:04:54.260
00:04:54.260 real 0m4.882s
00:04:54.260 user 0m1.175s
00:04:54.260 sys 0m2.133s
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1126 -- # xtrace_disable
00:04:54.260 16:28:51 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x
00:04:54.260 ************************************
00:04:54.260 END TEST single_node_setup
00:04:54.260 ************************************
00:04:54.260 16:28:51 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc
00:04:54.260 16:28:51 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:54.260 16:28:51 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:54.260 16:28:51 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:54.260 ************************************
00:04:54.260 START TEST even_2G_alloc
00:04:54.260 ************************************
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes
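get_test_nr_hugepages, entered above, converts the 2 GiB request into pages (2097152 kB / 2048 kB per page = 1024), and get_test_nr_hugepages_per_node then deals them out evenly across the NUMA nodes, filling nodes_test from the highest node index down; the trace that follows shows the resulting 512/512 split. Equivalent logic as a sketch (the trace shows only the final assignments, so the explicit division is an assumption):

  size=2097152                                  # requested hugepage pool, in kB (2 GiB)
  default_hugepages=2048                        # Hugepagesize from /proc/meminfo, in kB
  nr_hugepages=$(( size / default_hugepages ))  # 1024 pages
  _no_nodes=2
  declare -a nodes_test
  split=$(( nr_hugepages / _no_nodes ))         # 512 pages per node
  for (( i = _no_nodes - 1; i >= 0; i-- )); do
      nodes_test[i]=$split                      # highest index filled first, as in the trace
  done
  echo "${nodes_test[@]}"                       # 512 512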
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:54.260 16:28:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:56.816 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:56.816 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
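Each "Already using the vfio-pci driver" line above is scripts/setup.sh reporting that it has nothing to rebind: the sixteen Intel functions (vendor ID 8086, device ID 2021, which appear to be I/OAT DMA channels) and the NVMe controller at 0000:d8:00.0 (device ID 0a54) are already attached to vfio-pci from an earlier run. A hypothetical way to reproduce the check by hand (setup.sh's real logic is considerably more involved):

  for dev in /sys/bus/pci/devices/*; do
      drv=$(readlink "$dev/driver" 2>/dev/null) || continue  # no driver bound: skip
      [[ ${drv##*/} == vfio-pci ]] && echo "${dev##*/}: already using the vfio-pci driver"
  done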
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.079 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41939608 kB' 'MemAvailable: 43566220 kB' 'Buffers: 6784 kB' 'Cached: 10908192 kB' 'SwapCached: 76 kB' 'Active: 8336864 kB' 'Inactive: 3175828 kB' 'Active(anon): 7429536 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 600832 kB' 'Mapped: 154716 kB' 'Shmem: 9167004 kB' 'KReclaimable: 575440 kB' 'Slab: 1580096 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1004656 kB' 'KernelStack: 21920 kB' 'PageTables: 8860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11696904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[trace condensed: setup/common.sh@31-32 begin scanning this snapshot for AnonHugePages, continuing past MemTotal, MemFree, and the fields that follow; the excerpt ends mid-scan at the Dirty field]
setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.080 
16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.080 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.081 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41939284 kB' 'MemAvailable: 43565896 kB' 'Buffers: 6784 kB' 'Cached: 10908196 kB' 'SwapCached: 76 kB' 'Active: 8336524 kB' 
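
The trace above is setup/common.sh's get_meminfo() walking a /proc/meminfo snapshot one field at a time until the requested key matches, then echoing its value back to the caller. A minimal sketch of that loop, reconstructed only from the common.sh@16-@33 entries in this log (the names match the trace, but treat this as an inference from the log rather than SPDK's actual source):

shopt -s extglob    # the +([0-9]) pattern below needs extended globbing

get_meminfo() {
    local get=$1 node=$2    # e.g. get=AnonHugePages; node stays empty in this run
    local var val
    local mem_f mem
    mem_f=/proc/meminfo
    # With node unset this probes .../node/node/meminfo and fails, which is
    # exactly what the common.sh@23 test above shows; with a node number it
    # would switch to that node's meminfo file instead.
    [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")    # per-node files prefix every line with "Node N "
    # The common.sh@16 printf feeds the snapshot into the IFS=': ' read loop;
    # each non-matching key is one of the '# continue' entries seen above.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"    # the '# echo 0' entry: value only, units stripped
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo AnonHugePages    # prints 0 on this box; hugepages.sh@96 captures it as anon

A hypothetical per-NUMA-node use of the same sketch (not exercised in this log, where every call falls back to /proc/meminfo):

for n in /sys/devices/system/node/node[0-9]*; do
    echo "${n##*/}: $(get_meminfo HugePages_Total "${n##*node}") hugepages"
done
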
[repeated trace entries elided: the second snapshot is scanned the same way, setup/common.sh@32 testing every field from MemTotal through HugePages_Rsvd against HugePages_Surp and hitting '# continue' each time, until the match below]
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41939576 kB' 'MemAvailable: 43566188 kB' 'Buffers: 6784 kB' 'Cached: 10908212 kB' 'SwapCached: 76 kB' 'Active: 8336500 kB' 'Inactive: 3175828 kB' 'Active(anon): 7429172 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 600564 kB' 'Mapped: 154632 kB' 'Shmem: 9167024 kB' 'KReclaimable: 575440 kB' 'Slab: 1580072 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1004632 kB' 'KernelStack: 21984 kB' 'PageTables: 8684 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11698584 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
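
All of the snapshots in this section report the identical hugepage pool, and the numbers are self-consistent: Hugetlb should equal HugePages_Total times Hugepagesize. A quick cross-check, plain arithmetic rather than anything the test itself runs:

echo "$(( 1024 * 2048 )) kB"    # HugePages_Total * Hugepagesize -> 2097152 kB = 2 GiB, matching 'Hugetlb: 2097152 kB'

That 2 GiB pool of 2048 kB pages is, going by the test name, the even_2G_alloc allocation being verified here.
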
0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.083 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.084 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- 
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:04:57.085 nr_hugepages=1024
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:04:57.085 resv_hugepages=0
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:04:57.085 surplus_hugepages=0
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:04:57.085 anon_hugepages=0
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:57.085 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41937348 kB' 'MemAvailable: 43563960 kB' 'Buffers: 6784 kB' 'Cached: 10908252 kB' 'SwapCached: 76 kB' 'Active: 8337272 kB' 'Inactive: 3175828 kB' 'Active(anon): 7429944 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 601416 kB' 'Mapped: 154632 kB' 'Shmem: 9167064 kB' 'KReclaimable: 575440 kB' 'Slab: 1580072 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1004632 kB' 'KernelStack: 22032 kB' 'PageTables: 9308 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11701652 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218052 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[setup/common.sh@31-32 xtrace, condensed: every field from MemTotal through Unaccepted fails the HugePages_Total match and hits "continue"]
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
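[editor's note] The condensed scans above are SPDK's get_meminfo helper at work: it snapshots /proc/meminfo (or a NUMA node's meminfo file), then walks the "Key: value" pairs with IFS=': ' until the requested field matches and echoes its value. Below is a minimal standalone sketch of that loop; it is a simplified stand-in for setup/common.sh, not the exact helper, and the sed prefix-strip replaces the script's extglob expansion mem=("${mem[@]#Node +([0-9]) }").

#!/usr/bin/env bash
# Sketch of the get_meminfo loop traced above: split each "Key: value kB"
# line on ':'/' ' and print the value of the requested field.
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-node queries read the NUMA node's own meminfo file instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Node files prefix every line with "Node N "; strip it, as the real
    # script does with its extglob trick, so the key lands in $var.
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # the long "continue" runs above
        echo "$val"
        return 0
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    return 1
}

get_meminfo HugePages_Total     # 1024 in the run above
get_meminfo HugePages_Surp 0    # surplus 2M pages on NUMA node 0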
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:57.087 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23587560 kB' 'MemUsed: 9046876 kB' 'SwapCached: 44 kB' 'Active: 5089652 kB' 'Inactive: 535260 kB' 'Active(anon): 4312092 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5340544 kB' 'Mapped: 76392 kB' 'AnonPages: 287560 kB' 'Shmem: 4027736 kB' 'KernelStack: 11032 kB' 'PageTables: 5680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 400876 kB' 'Slab: 887260 kB' 'SReclaimable: 400876 kB' 'SUnreclaim: 486384 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[setup/common.sh@31-32 xtrace, condensed: every node-0 field from MemTotal through HugePages_Free fails the HugePages_Surp match and hits "continue"]
00:04:57.088 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:57.088 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:57.088 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:57.088 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:57.088 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:57.349 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18350568 kB' 'MemUsed: 9298792 kB' 'SwapCached: 32 kB' 'Active: 3247288 kB' 'Inactive: 2640568 kB' 'Active(anon): 3117520 kB' 'Inactive(anon): 2335128 kB' 'Active(file): 129768 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5574608 kB' 'Mapped: 78240 kB' 'AnonPages: 313464 kB' 'Shmem: 5139368 kB' 'KernelStack: 10984 kB' 'PageTables: 3520 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 174564 kB' 'Slab: 692812 kB' 'SReclaimable: 174564 kB' 'SUnreclaim: 518248 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[setup/common.sh@31-32 xtrace, condensed: every node-1 field from MemTotal through HugePages_Free fails the HugePages_Surp match and hits "continue"]
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
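[editor's note] Both per-node queries return HugePages_Surp: 0, and hugepages.sh@114-@116 fold reserved and surplus pages into each node's expected count before the node0=512/node1=512 comparison that follows. A short sketch of that accounting with this run's values (512 pages pre-assigned per node, resv=0; the literal surp=0 stands in for the get_meminfo call):

#!/usr/bin/env bash
# Per-node accounting mirrored from the hugepages.sh@114-@116 trace:
# start from the even 512/512 split and add reserved + surplus pages
# (both 0 in this run) before checking against the kernel's view.
declare -a nodes_test=([0]=512 [1]=512)
resv=0

for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))   # hugepages.sh@115
    surp=0                           # would be: get_meminfo HugePages_Surp "$node"
    (( nodes_test[node] += surp ))   # hugepages.sh@116
    echo "node$node=${nodes_test[node]} expecting 512"
done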
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:04:57.350 node0=512 expecting 512
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:04:57.350 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:04:57.351 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:04:57.351 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:04:57.351 node1=512 expecting 512
00:04:57.351 16:28:54 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:04:57.351
00:04:57.351 real 0m2.985s
00:04:57.351 user 0m1.040s
00:04:57.351 sys 0m1.909s
00:04:57.351 16:28:54 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:04:57.351 16:28:54 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:57.351 ************************************
00:04:57.351 END TEST even_2G_alloc
00:04:57.351 ************************************
00:04:57.351 16:28:54 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:04:57.351 16:28:54 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:57.351 16:28:54 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:57.351 16:28:54 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:57.351 ************************************
00:04:57.351 START TEST odd_alloc
00:04:57.351 ************************************
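odd_alloc deliberately requests a hugepage count that cannot split evenly across the two NUMA nodes: HUGEMEM=2049 MB becomes size=2098176 kB, which is 1024.5 two-megabyte pages, so the helper settles on nr_hugepages=1025. A worked check of that arithmetic (assuming the 2048 kB default page size the snapshots below report, and the round-up that get_test_nr_hugepages's trace implies):

    size_kb=2098176                                  # 2049 MB, as traced below
    page_kb=2048                                     # default hugepage size in kB
    echo $(( (size_kb + page_kb - 1) / page_kb ))    # -> 1025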
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:57.351 16:28:54 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:00.644 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:00.644 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
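The get_test_nr_hugepages_per_node trace above assigns the nodes back to front: 1025/2 rounds down to 512 for node1, and the remaining 513 land on node0. A condensed sketch of that split (an assumption: simplified from the setup/hugepages.sh loop at lines 80-83 shown in the trace):

    _nr_hugepages=1025
    _no_nodes=2
    declare -a nodes_test
    while (( _no_nodes > 0 )); do
        # last node gets the rounded-down share; the remainder rolls forward
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
        _nr_hugepages=$(( _nr_hugepages - nodes_test[_no_nodes - 1] ))
        (( _no_nodes-- ))
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # -> node0=513 node1=512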
00:05:00.644 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41974068 kB' 'MemAvailable: 43600680 kB' 'Buffers: 6784 kB' 'Cached: 10908376 kB' 'SwapCached: 76 kB' 'Active: 8335912 kB' 'Inactive: 3175828 kB' 'Active(anon): 7428584 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599468 kB' 'Mapped: 153484 kB' 'Shmem: 9167188 kB' 'KReclaimable: 575440 kB' 'Slab: 1579144 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003704 kB' 'KernelStack: 21888 kB' 'PageTables: 8320 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11689484 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218244 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
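Two fields in this snapshot confirm the request landed: HugePages_Total: 1025 and Hugetlb: 2099200 kB, and with no surplus pages in play the second is exactly the first times the 2048 kB page size. A quick cross-check against a live /proc/meminfo (standard fields on any recent kernel):

    awk '/^HugePages_Total/ {n = $2}
         /^Hugepagesize/    {sz = $2}
         END {print n * sz " kB"}' /proc/meminfo    # 1025 * 2048 = 2099200 kB here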
00:05:00.645 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # (xtrace condensed: the read loop walks the snapshot above field by field, continuing past everything before AnonHugePages)
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41973776 kB' 'MemAvailable: 43600388 kB' 'Buffers: 6784 kB' 'Cached: 10908380 kB' 'SwapCached: 76 kB' 'Active: 8335920 kB' 'Inactive: 3175828 kB' 'Active(anon): 7428592 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599744 kB' 'Mapped: 153484 kB' 'Shmem: 9167192 kB' 'KReclaimable: 575440 kB' 'Slab: 1579168 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003728 kB' 'KernelStack: 21904 kB' 'PageTables: 8236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11689500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218148 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
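verify_nr_hugepages reads three counters back to back -- anon=0 just above, this HugePages_Surp pass, and a HugePages_Rsvd pass next -- so transparent hugepages, surplus pages, and reserved-but-unfaulted pages can all be ruled out before per-node totals are compared. Gathered in one place (a sketch reusing the get_meminfo shape from the earlier example):

    anon=$(get_meminfo AnonHugePages)    # THP in use: 0 in this run
    surp=$(get_meminfo HugePages_Surp)   # pages allocated beyond nr_hugepages: 0
    resv=$(get_meminfo HugePages_Rsvd)   # reserved but not yet faulted in: 0
    echo "anon=$anon surp=$surp resv=$resv"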
00:05:00.646 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # (xtrace condensed: the same field-by-field scan repeats over the snapshot above until HugePages_Surp matches)
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41974516 kB' 'MemAvailable: 43601128 kB' 'Buffers: 6784 kB' 'Cached: 10908396 kB' 'SwapCached: 76 kB' 'Active: 8335856 kB' 'Inactive: 3175828 kB' 'Active(anon): 7428528 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599672 kB' 'Mapped: 153484 kB' 'Shmem: 9167208 kB' 'KReclaimable: 575440 kB' 'Slab: 1579328 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003888 kB' 'KernelStack: 21952 kB' 'PageTables: 8868 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11687912 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
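With all three counters expected at 0, what remains is whether each NUMA node holds its share -- 513 and 512 here, the same kind of per-node check that echoed node0=512/node1=512 in the even_2G_alloc pass above. The per-node pools can be read back directly through the standard kernel sysfs interface (a sketch assuming 2 MB default hugepages, not the exact commands the test runs):

    for n in /sys/devices/system/node/node[0-9]*; do
        echo "${n##*/}: $(cat "$n"/hugepages/hugepages-2048kB/nr_hugepages)"
    done
    # expected here: node0: 513, node1: 512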
00:05:00.648 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # (xtrace condensed: the scan repeats against HugePages_Rsvd; the fields from MemTotal through HardwareCorrupted are skipped and the trace continues below)
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025 00:05:00.650 nr_hugepages=1025 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:00.650 resv_hugepages=0 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:00.650 surplus_hugepages=0 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:00.650 anon_hugepages=0 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages )) 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41970704 kB' 'MemAvailable: 43597316 kB' 'Buffers: 6784 kB' 'Cached: 10908420 kB' 'SwapCached: 76 kB' 'Active: 8341064 kB' 'Inactive: 3175828 kB' 'Active(anon): 7433736 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 604964 kB' 'Mapped: 154264 kB' 'Shmem: 9167232 kB' 'KReclaimable: 575440 kB' 'Slab: 1579488 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1004048 kB' 'KernelStack: 21840 kB' 'PageTables: 8480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 
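The extraction that just completed is the generic key/value scan common.sh runs over a meminfo-style file: split each line on ': ', skip until the requested field matches, then print its value and return. A minimal standalone sketch of that traced pattern follows; the helper name get_meminfo_value is illustrative, not the suite's own function, and the file-read form is a simplification of the mapfile-based loop in the trace.

    #!/usr/bin/env bash
    # Sketch of the traced pattern: scan a meminfo-style file with IFS=': ',
    # skip every non-matching field (the long runs of 'continue' above),
    # and print the value of the requested key. Helper name is hypothetical.
    get_meminfo_value() {
        local get=$1 file=${2:-/proc/meminfo}
        local var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # one 'continue' per skipped field
            echo "$val"
            return 0
        done <"$file"
        return 1
    }

    # e.g. resv=$(get_meminfo_value HugePages_Rsvd)   # -> 0 in the run above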
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:00.650 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41970704 kB' 'MemAvailable: 43597316 kB' 'Buffers: 6784 kB' 'Cached: 10908420 kB' 'SwapCached: 76 kB' 'Active: 8341064 kB' 'Inactive: 3175828 kB' 'Active(anon): 7433736 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 604964 kB' 'Mapped: 154264 kB' 'Shmem: 9167232 kB' 'KReclaimable: 575440 kB' 'Slab: 1579488 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1004048 kB' 'KernelStack: 21840 kB' 'PageTables: 8480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 11693036 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218040 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:00.650-00:05:00.652 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [trace condensed: the same read loop skips every field that is not HugePages_Total, from MemTotal through Unaccepted, one "[[ ... ]]" / "continue" pair per field]
00:05:00.652 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:00.652 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:05:00.652 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:00.652 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:00.652 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
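get_nodes above recorded 513 pages on node0 and 512 on node1: the point of the odd_alloc case is that an odd total of 1025 pages cannot split evenly across two NUMA nodes, so one node carries the extra page. A sketch of that arithmetic follows; nr_hugepages and no_nodes mirror the trace, but the split computation itself is an illustrative assumption (the suite reads the per-node counts back from sysfs rather than computing them here).

    #!/usr/bin/env bash
    # Sketch: divide an odd hugepage count across NUMA nodes, letting the
    # first nodes absorb the remainder -- consistent with nodes_sys[0]=513
    # and nodes_sys[1]=512 in the trace. no_nodes is hard-coded so the
    # sketch runs anywhere; a live box would count /sys/devices/system/node/node*.
    nr_hugepages=1025
    no_nodes=2
    base=$((nr_hugepages / no_nodes))
    extra=$((nr_hugepages % no_nodes))
    for ((i = 0; i < no_nodes; i++)); do
        nodes_sys[i]=$((base + (i < extra ? 1 : 0)))
    done
    echo "${nodes_sys[@]}"   # prints: 513 512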
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:00.914 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23588628 kB' 'MemUsed: 9045808 kB' 'SwapCached: 44 kB' 'Active: 5090568 kB' 'Inactive: 535260 kB' 'Active(anon): 4313008 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5340556 kB' 'Mapped: 76096 kB' 'AnonPages: 288428 kB' 'Shmem: 4027748 kB' 'KernelStack: 10872 kB' 'PageTables: 4980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 400876 kB' 'Slab: 886544 kB' 'SReclaimable: 400876 kB' 'SUnreclaim: 485668 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:05:00.914-00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [trace condensed: the read loop skips every node0 field that is not HugePages_Surp, from MemTotal through HugePages_Free]
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
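The per-node read above differs from the system-wide one in two details visible in the trace: mem_f switches to /sys/devices/system/node/nodeN/meminfo, and every line there carries a "Node N " prefix that the extglob substitution mem=("${mem[@]#Node +([0-9]) }") strips before the same key/value scan runs. A hedged sketch of that combination follows; the function name is hypothetical and it requires Linux with NUMA sysfs.

    #!/usr/bin/env bash
    # Sketch of the per-node variant traced above: read nodeN/meminfo,
    # strip the "Node N " line prefix via extglob, then run the same
    # key/value scan. Function name is illustrative.
    shopt -s extglob

    node_meminfo_value() {
        local node=$1 get=$2 line var val _
        local mem_f=/sys/devices/system/node/node${node}/meminfo
        local -a mem
        mapfile -t mem <"$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done
        return 1
    }

    # e.g. node_meminfo_value 0 HugePages_Surp   # -> 0 in the run above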
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18380688 kB' 'MemUsed: 9268672 kB' 'SwapCached: 32 kB' 'Active: 3247236 kB' 'Inactive: 2640568 kB' 'Active(anon): 3117468 kB' 'Inactive(anon): 2335128 kB' 'Active(file): 129768 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5574764 kB' 'Mapped: 78040 kB' 'AnonPages: 313132 kB' 'Shmem: 5139524 kB' 'KernelStack: 10952 kB' 'PageTables: 3416 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 174564 kB' 'Slab: 692944 kB' 'SReclaimable: 174564 kB' 'SUnreclaim: 518380 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:00.915 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:00.915-00:05:00.916 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [trace condensed: the read loop skips node1 fields that are not HugePages_Surp, MemTotal through FileHugePages; the capture is truncated here]
setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.916 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513' 00:05:00.917 node0=513 expecting 513 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:05:00.917 node1=512 expecting 512 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:00.917 00:05:00.917 real 0m3.508s 00:05:00.917 user 0m1.339s 00:05:00.917 sys 0m2.237s 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:00.917 16:28:58 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:00.917 ************************************ 00:05:00.917 END TEST odd_alloc 00:05:00.917 ************************************ 00:05:00.917 16:28:58 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test 
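The long runs of "-- # continue" above are bash xtrace from the /proc/meminfo lookup helper in setup/common.sh: it reads the file into an array, strips any per-node prefix, then compares each key against the requested one. A minimal sketch of that logic follows; get_meminfo_sketch and its body are illustrative reconstructions from the trace records, not the verbatim SPDK helper.

  # Illustrative only -- mirrors the get_meminfo trace records above.
  get_meminfo_sketch() {
      local get=$1 node=${2:-}
      local mem_f=/proc/meminfo
      # per-NUMA-node stats live in sysfs; fall back to /proc/meminfo otherwise
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      shopt -s extglob
      local -a mem
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")      # sysfs lines carry a "Node N " prefix
      local line var val _
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue  # each skipped key logs one "continue"
          echo "${val:-0}"
          return 0
      done
      return 1
  }

Here the HugePages_Surp lookup reaches its key near the end of the file and prints 0, which is the echo 0 / return 0 pair in the records above.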
00:05:00.917 16:28:58 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:05:00.917 16:28:58 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable
00:05:00.917 16:28:58 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:00.917 ************************************
00:05:00.917 START TEST custom_alloc
00:05:00.917 ************************************
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=,
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=()
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 1 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}"
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] ))
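The two get_test_nr_hugepages calls traced above turn a requested size into a page count and then spread it over the NUMA nodes. A hedged sketch of that arithmetic, assuming size and Hugepagesize are both in kB, which is what the numbers imply (1048576/2048 = 512, 2097152/2048 = 1024); the function name and the plain even split are illustrative, and the split ignores the odd-remainder handling that odd_alloc exercised (513 became 257+256 there):

  default_hugepages=2048   # kB, i.e. Hugepagesize from /proc/meminfo
  get_test_nr_hugepages_sketch() {
      local size=$1 no_nodes=$2                 # size in kB, per the trace
      local nr=$(( size / default_hugepages ))
      local -a nodes_test
      local n
      # fill from the last node down, mirroring the repeated
      # "nodes_test[_no_nodes - 1]=256" records in the trace
      for (( n = no_nodes; n > 0; n-- )); do
          nodes_test[n - 1]=$(( nr / no_nodes ))
      done
      echo "nr_hugepages=$nr per node: ${nodes_test[*]}"
  }
  get_test_nr_hugepages_sketch 1048576 2   # -> nr_hugepages=512 per node: 256 256

The second call in the trace takes a different branch: because nodes_hp[0] is already set, the helper copies nodes_hp into nodes_test (the "(( 1 > 0 ))" and "nodes_test[_no_nodes]=512" records) instead of splitting evenly.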
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 2 > 0 ))
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}"
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:00.917 16:28:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:04.206 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:04.206 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages
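The HUGENODE= record above is the hand-off point: custom_alloc joins the per-node page counts using the comma IFS it declared at the top of the test and passes the result to scripts/setup.sh in the environment. A sketch of that join; the subshell-based join is an assumption for self-containment (the test itself relies on its local IFS=, for the same effect, as the @171-@173 records show):

  declare -a nodes_hp=([0]=512 [1]=1024)
  hugenode=()
  for node in "${!nodes_hp[@]}"; do
      hugenode+=("nodes_hp[$node]=${nodes_hp[node]}")
  done
  # join with commas: nodes_hp[0]=512,nodes_hp[1]=1024
  HUGENODE=$(IFS=,; echo "${hugenode[*]}")
  HUGENODE=$HUGENODE /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh

setup.sh then reserves 512 pages on node 0 and 1024 on node 1, 1536 in total, which is the nr_hugepages=1536 the verifier starts from below.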
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:04.472 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40927668 kB' 'MemAvailable: 42554280 kB' 'Buffers: 6784 kB' 'Cached: 10908552 kB' 'SwapCached: 76 kB' 'Active: 8334392 kB' 'Inactive: 3175828 kB' 'Active(anon): 7427064 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597592 kB' 'Mapped: 153628 kB' 'Shmem: 9167364 kB' 'KReclaimable: 575440 kB' 'Slab: 1579192 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003752 kB' 'KernelStack: 21840 kB' 'PageTables: 8428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11687556 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... xtrace condensed: the same read/continue scan over keys MemTotal through HardwareCorrupted, none matching AnonHugePages ...]
00:05:04.473 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:04.473 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:04.473 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:04.473 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0
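One detail worth calling out from the get_meminfo preamble above is the mem=("${mem[@]#Node +([0-9]) }") record. Per-node meminfo files in sysfs prefix every line with "Node N ", so stripping that prefix lets the same parser serve both /proc/meminfo and the per-node files. A self-contained demonstration; the sample lines are made up:

  shopt -s extglob                    # +([0-9]) is an extglob pattern
  mem=('Node 0 HugePages_Total: 512' 'Node 1 HugePages_Total: 1024')
  mem=("${mem[@]#Node +([0-9]) }")    # drop the "Node N " prefix from each element
  printf '%s\n' "${mem[@]}"
  # HugePages_Total: 512
  # HugePages_Total: 1024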
'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.474 16:29:01 setup.sh.hugepages.custom_alloc -- 
[xtrace elided: setup/common.sh@31-32 walks the remaining /proc/meminfo keys (SUnreclaim through HugePages_Free), compares each against HugePages_Surp, and hits "continue" on every non-match]
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:04.475 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:04.476 16:29:01 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40929032 kB' 'MemAvailable: 42555644 kB' 'Buffers: 6784 kB' 'Cached: 10908556 kB' 'SwapCached: 76 kB' 'Active: 8333936 kB' 'Inactive: 3175828 kB' 'Active(anon): 7426608 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597644 kB' 'Mapped: 153492 kB' 'Shmem: 9167368 kB' 'KReclaimable: 575440 kB' 'Slab: 1579144 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003704 kB' 'KernelStack: 21840 kB' 'PageTables: 8468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11687596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace elided: setup/common.sh@31-32 compares every key of the snapshot above, MemTotal through HugePages_Free, against HugePages_Rsvd; each non-match hits "continue"]
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536
nr_hugepages=1536
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
resv_hugepages=0
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
anon_hugepages=0
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages ))
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
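For readers decoding the trace: get_meminfo in setup/common.sh is a key/value scanner over a meminfo-style file. It splits each line on ': ', compares the key against the requested name, and echoes that key's value. Below is a minimal standalone sketch of that pattern, assuming nothing beyond what the trace shows; get_meminfo_sketch and its internals are illustrative names, not the SPDK source.

#!/usr/bin/env bash
# Sketch of the meminfo lookup pattern traced above (illustrative, not
# setup/common.sh itself).
shopt -s extglob

get_meminfo_sketch() {
    local get=$1 node=$2 line var val _
    local mem_f=/proc/meminfo
    local -a mem
    # With a node argument, read that node's own meminfo file instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; strip it, as the
    # traced mem=("${mem[@]#Node +([0-9]) }") expansion does.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"   # e.g. 0 for HugePages_Rsvd, 1536 for HugePages_Total
            return 0
        fi
    done
    return 1
}

get_meminfo_sketch HugePages_Rsvd     # system-wide; prints 0 on this machine
get_meminfo_sketch HugePages_Surp 0   # NUMA node 0 variant

The long runs of "continue" in the raw log are simply this loop skipping every key that is not the one requested; only the matching key reaches the echo.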
[xtrace elided: get_meminfo re-enters with the same defaults (local var val, mem_f=/proc/meminfo, mapfile -t mem, "Node +([0-9]) " prefix strip)]
00:05:04.478 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 40930436 kB' 'MemAvailable: 42557048 kB' 'Buffers: 6784 kB' 'Cached: 10908612 kB' 'SwapCached: 76 kB' 'Active: 8333592 kB' 'Inactive: 3175828 kB' 'Active(anon): 7426264 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 597200 kB' 'Mapped: 153492 kB' 'Shmem: 9167424 kB' 'KReclaimable: 575440 kB' 'Slab: 1579144 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003704 kB' 'KernelStack: 21808 kB' 'PageTables: 8352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 11687616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace elided: setup/common.sh@31-32 compares every key, MemTotal through Unaccepted, against HugePages_Total; each non-match hits "continue"]
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
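The get_nodes step above walks /sys/devices/system/node/node+([0-9]) and records a per-node hugepage target (512 pages on node0, 1024 on node1 for this custom_alloc case), giving no_nodes=2. Below is a hedged sketch of collecting the same per-node view; reading HugePages_Total out of each node's meminfo is an assumption here, since the trace does not show where the 512/1024 values originate.

#!/usr/bin/env bash
# Illustrative sketch, not setup/hugepages.sh: enumerate NUMA nodes and
# record each node's hugepage count.
shopt -s extglob nullglob

declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    id=${node##*node}   # "/sys/devices/system/node/node0" -> "0"
    # Assumption: take the count from the node's own meminfo; the real
    # script may source it differently.
    nodes_sys[$id]=$(awk '/HugePages_Total/ {print $NF}' "$node/meminfo")
done

echo "no_nodes=${#nodes_sys[@]}"   # 2 on this machine
for id in "${!nodes_sys[@]}"; do
    echo "node$id: HugePages_Total=${nodes_sys[$id]}"   # 512 / 1024 here
done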
00:05:04.480 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23583704 kB' 'MemUsed: 9050732 kB' 'SwapCached: 44 kB' 'Active: 5087896 kB' 'Inactive: 535260 kB' 'Active(anon): 4310336 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5340604 kB' 'Mapped: 75604 kB' 'AnonPages: 285736 kB' 'Shmem: 4027796 kB' 'KernelStack: 10872 kB' 'PageTables: 4984 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 400876 kB' 'Slab: 886108 kB' 'SReclaimable: 400876 kB' 'SUnreclaim: 485232 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: per-key scan of the node0 snapshot above against HugePages_Surp; keys MemTotal through FileHugePages each hit "continue"]
00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc --
setup/common.sh@32 -- # continue 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 17346612 kB' 'MemUsed: 10302748 kB' 'SwapCached: 
32 kB' 'Active: 3246112 kB' 'Inactive: 2640568 kB' 'Active(anon): 3116344 kB' 'Inactive(anon): 2335128 kB' 'Active(file): 129768 kB' 'Inactive(file): 305440 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5574892 kB' 'Mapped: 77888 kB' 'AnonPages: 311880 kB' 'Shmem: 5139652 kB' 'KernelStack: 10952 kB' 'PageTables: 3428 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 174564 kB' 'Slab: 693036 kB' 'SReclaimable: 174564 kB' 'SUnreclaim: 518472 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.481 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.482 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.742 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512' 00:05:04.743 node0=512 expecting 512 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024' 00:05:04.743 node1=1024 expecting 1024 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:04.743 00:05:04.743 real 0m3.699s 00:05:04.743 user 0m1.412s 00:05:04.743 sys 0m2.354s 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:04.743 16:29:02 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:04.743 ************************************ 00:05:04.743 END TEST custom_alloc 00:05:04.743 ************************************ 00:05:04.743 16:29:02 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:04.743 16:29:02 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:04.743 16:29:02 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # 
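Every '-- setup/common.sh@NN --' record in the loops above comes from one helper, get_meminfo, which resolves either /proc/meminfo or a node's sysfs meminfo file and scans it field by field until the requested key matches. A minimal bash sketch of that technique, reconstructed from the traced commands rather than copied from the SPDK source (names follow the trace; treat it as illustrative):

#!/usr/bin/env bash
# get_meminfo sketch, reconstructed from the xtrace above (not verbatim
# setup/common.sh). Prints the value of one meminfo field, reading the
# per-node sysfs file when a node number is given.
shopt -s extglob                     # for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 node=$2
    local var val _
    local mem_f mem
    mem_f=/proc/meminfo
    # With an empty $node this path does not exist, so /proc/meminfo wins,
    # exactly as in the traced [[ -e .../node/node/meminfo ]] check.
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; strip that off so
    # the same loop parses both file formats.
    mem=("${mem[@]#Node +([0-9]) }")
    local IFS=': '
    while read -r var val _; do
        [[ $var == "$get" ]] || continue   # the traced `continue` spam
        echo "$val"                        # e.g. 0 for HugePages_Surp above
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo HugePages_Surp 1   # node 1 surplus hugepages -> 0 in this run

The "Node N " strip at common.sh@29 is the detail that lets one parser serve both /proc/meminfo and the per-node sysfs files.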
00:05:04.743 16:29:02 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:04.743 ************************************
00:05:04.743 START TEST no_shrink_alloc
00:05:04.743 ************************************
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0')
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:04.743 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:04.744 16:29:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
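The get_test_nr_hugepages records just traced reduce to one line of arithmetic: the 2097152 kB request divided by the 2048 kB hugepage size gives 1024 pages, all pinned to node 0 because HUGENODE=0 supplied an explicit node list. A sketch of that computation, reconstructed from the hugepages.sh@48-72 trace (default_hugepages here is an assumption standing in for the default hugepage size in kB, consistent with 'Hugepagesize: 2048 kB' later in this log):

# get_test_nr_hugepages sketch, reconstructed from the xtrace above.
# size is a request in kB; any remaining args are node ids to pin to.
default_hugepages=2048               # kB; matches Hugepagesize in this run

get_test_nr_hugepages() {
    local size=$1
    shift
    local user_nodes=("$@")          # ('0') in this run, via HUGENODE=0
    (( size >= default_hugepages )) || return 1
    nr_hugepages=$(( size / default_hugepages ))   # 2097152 / 2048 = 1024
    declare -ga nodes_test=()
    local node
    if (( ${#user_nodes[@]} > 0 )); then
        for node in "${user_nodes[@]}"; do
            nodes_test[node]=$nr_hugepages         # everything on node 0 here
        done
    fi
    return 0
}

get_test_nr_hugepages 2097152 0
echo "nr_hugepages=$nr_hugepages node0=${nodes_test[0]}"   # 1024 / 1024

nodes_test is then what verify_nr_hugepages below compares against the per-node HugePages counters.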
00:05:08.041 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:08.041 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:08.041 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:08.042 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41973596 kB' 'MemAvailable: 43600208 kB' 'Buffers: 6784 kB' 'Cached: 10908716 kB' 'SwapCached: 76 kB' 'Active: 8335904 kB' 'Inactive: 3175828 kB' 'Active(anon): 7428576 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599484 kB' 'Mapped: 153656 kB' 'Shmem: 9167528 kB' 'KReclaimable: 575440 kB' 'Slab: 1578820 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003380 kB' 'KernelStack: 22096 kB' 'PageTables: 8716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11691076 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218148 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace elided: setup/common.sh@31-32 steps through the /proc/meminfo fields above, `continue`-ing until var matches AnonHugePages]
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
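The `always [madvise] never` string in the hugepages.sh@95 check above is the content of /sys/kernel/mm/transparent_hugepage/enabled, where the bracketed entry marks the active THP mode; verify_nr_hugepages only reads AnonHugePages when THP is not pinned to [never]. A sketch of that gate (the sysfs path is standard kernel ABI; the surrounding accounting is a reconstruction, not the verbatim script):

# THP gate sketch, reconstructed from the @95/@96 records above.
# The sysfs file reads e.g. "always [madvise] never", with the active
# mode in brackets; the file exists on THP-capable kernels.
thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null)

anon=0
if [[ $thp != *"[never]"* ]]; then
    # THP is not disabled, so transparent hugepages may exist; read the
    # counter (get_meminfo is the helper sketched after END TEST
    # custom_alloc; AnonHugePages is 0 kB in this run).
    anon=$(get_meminfo AnonHugePages)
fi
echo "anon=$anon"                    # anon=0, matching the trace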
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:08.043 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:08.044 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41974292 kB' 'MemAvailable: 43600904 kB' 'Buffers: 6784 kB' 'Cached: 10908716 kB' 'SwapCached: 76 kB' 'Active: 8334864 kB' 'Inactive: 3175828 kB' 'Active(anon): 7427536 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598308 kB' 'Mapped: 153520 kB' 'Shmem: 9167528 kB' 'KReclaimable: 575440 kB' 'Slab: 1578796 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003356 kB' 'KernelStack: 21872 kB' 'PageTables: 8376 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11690932 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace elided: per-field scan over the /proc/meminfo fields]
00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.045 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- 
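[editor's note: the get_meminfo trace above reduces to a small bash idiom -- read /proc/meminfo (or a per-NUMA-node meminfo file) line by line, split on IFS=': ', and print the value column of the requested key. A minimal standalone sketch of that pattern follows, assuming bash 4+; the name get_meminfo_sketch and its interface are illustrative, not SPDK's exact source:]

    # Print the value column for <key> from /proc/meminfo, or from the
    # per-node file when a NUMA node number is given as the second argument.
    get_meminfo_sketch() {
        local get=$1 node=${2:-} line var val _
        local mem_f=/proc/meminfo
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        while read -r line; do
            line=${line#"Node $node "}   # per-node files prefix every line with "Node N "
            IFS=': ' read -r var val _ <<<"$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done <"$mem_f"
        return 1
    }
    # e.g. on this runner: get_meminfo_sketch HugePages_Surp   -> 0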
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:08.046 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41973228 kB' 'MemAvailable: 43599840 kB' 'Buffers: 6784 kB' 'Cached: 10908740 kB' 'SwapCached: 76 kB' 'Active: 8335316 kB' 'Inactive: 3175828 kB' 'Active(anon): 7427988 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598728 kB' 'Mapped: 153520 kB' 'Shmem: 9167552 kB' 'KReclaimable: 575440 kB' 'Slab: 1578796 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003356 kB' 'KernelStack: 21888 kB' 'PageTables: 8736 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11691116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[xtrace condensed: same setup/common.sh@31-@32 key-by-key scan as above, continuing past every key until HugePages_Rsvd matches below.]
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:08.048 nr_hugepages=1024
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:08.048 resv_hugepages=0
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:08.048 surplus_hugepages=0
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:08.048 anon_hugepages=0
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
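[editor's note: with surp=0 and resv=0 read back above, the hugepages.sh@106 check reduces to 1024 == 1024 + 0 + 0, i.e. every configured page is accounted for. The snapshots are internally consistent as well: HugePages_Total x Hugepagesize = 1024 x 2048 kB = 2097152 kB, matching the Hugetlb field. A hedged restatement of the assertion, reusing the illustrative helper sketched in the earlier note:]

    nr_hugepages=1024                               # the count this test configured
    total=$(get_meminfo_sketch HugePages_Total)     # 1024 in the snapshots above
    surp=$(get_meminfo_sketch HugePages_Surp)       # 0
    resv=$(get_meminfo_sketch HugePages_Rsvd)       # 0
    (( total == nr_hugepages + surp + resv )) && echo "hugepage accounting OK"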
get=HugePages_Total 00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:08.048 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41974060 kB' 'MemAvailable: 43600672 kB' 'Buffers: 6784 kB' 'Cached: 10908760 kB' 'SwapCached: 76 kB' 'Active: 8335132 kB' 'Inactive: 3175828 kB' 'Active(anon): 7427804 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598524 kB' 'Mapped: 153520 kB' 'Shmem: 9167572 kB' 'KReclaimable: 575440 kB' 'Slab: 1578796 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003356 kB' 'KernelStack: 21984 kB' 'PageTables: 8448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11691136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.049 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.049 16:29:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:08.050 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # read -r var val _ / continue for each remaining field: AnonPages, Mapped, Shmem, KReclaimable, Slab, SReclaimable, SUnreclaim, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted -- none match HugePages_Total
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
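The two nodes_sys assignments above (node0 -> 1024, node1 -> 0) are get_nodes walking the sysfs node directories and keying an array by node id. A hedged illustration of the idiom; the real get_nodes in setup/hugepages.sh may source the per-node counts differently than the direct sysfs read used here:

    shopt -s extglob                       # needed for the +([0-9]) glob below
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        id=${node##*node}                  # "/sys/devices/system/node/node0" -> "0"
        nodes_sys[id]=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}              # 2 on this host, matching no_nodes=2 above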
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22518592 kB' 'MemUsed: 10115844 kB' 'SwapCached: 44 kB' 'Active: 5087688 kB' 'Inactive: 535260 kB' 'Active(anon): 4310128 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5340624 kB' 'Mapped: 75628 kB' 'AnonPages: 285396 kB' 'Shmem: 4027816 kB' 'KernelStack: 10984 kB' 'PageTables: 5236 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 400876 kB' 'Slab: 885788 kB' 'SReclaimable: 400876 kB' 'SUnreclaim: 484912 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:05:08.051 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # read -r var val _ / continue for each field from MemTotal through HugePages_Free in the dump above -- none match HugePages_Surp
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
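Everything from the printf dump down to return 0 above is one call into the meminfo reader in setup/common.sh. A minimal sketch of get_meminfo as it can be reconstructed from the xtrace (names and structure follow the trace; the real helper may differ in detail):

    shopt -s extglob
    get_meminfo() {
        local get=$1 node=$2
        local var val
        local mem_f mem
        mem_f=/proc/meminfo
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            # a node id was given and sysfs exposes a per-node view
            mem_f=/sys/devices/system/node/node$node/meminfo
        elif [[ -n $node ]]; then
            return 1                       # node requested but not present
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # strip the "Node N " prefix of per-node files
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }

    # e.g. get_meminfo HugePages_Total    -> 1024 (system-wide)
    #      get_meminfo HugePages_Surp 0   -> 0    (node 0, per the dump above)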
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:08.053 node0=1024 expecting 1024
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:08.053 16:29:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:11.348 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:11.348 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:11.348 INFO: Requested 512 hugepages but 1024 already allocated on node0
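NRHUGE=512 with HUGENODE=0 asks setup.sh for 512 hugepages on node 0, and the INFO line records why nothing changes: the node already holds 1024, and this path never shrinks an existing allocation (hence the test name no_shrink_alloc). A hedged sketch of that decision against the standard kernel sysfs knob; the variable names here are illustrative, not necessarily those used by scripts/setup.sh:

    node=0 requested=512
    knob=/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages
    nr=$(<"$knob")
    if (( nr >= requested )); then
        echo "INFO: Requested $requested hugepages but $nr already allocated on node$node"
    else
        echo "$requested" > "$knob"        # growing is allowed; shrinking is not attempted
    fi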
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:11.348 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42003764 kB' 'MemAvailable: 43630376 kB' 'Buffers: 6784 kB' 'Cached: 10908868 kB' 'SwapCached: 76 kB' 'Active: 8336056 kB' 'Inactive: 3175828 kB' 'Active(anon): 7428728 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 599464 kB' 'Mapped: 153624 kB' 'Shmem: 9167680 kB' 'KReclaimable: 575440 kB' 'Slab: 1579300 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003860 kB' 'KernelStack: 21936 kB' 'PageTables: 8788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11688912 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # read -r var val _ / continue for each field from MemTotal through HardwareCorrupted in the dump above -- none match AnonHugePages
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
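The anon=0 just traced is the product of the two steps before it: hugepages.sh@95 first checks that transparent hugepages are not disabled outright (the current setting is "[madvise]", not "[never]"), and @96 then reads AnonHugePages through get_meminfo. A hedged reconstruction:

    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)   # "always [madvise] never" here
    anon=0
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)                 # 0 on this host, per the dump above
    fi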
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:11.349 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:11.350 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42005520 kB' 'MemAvailable: 43632132 kB' 'Buffers: 6784 kB' 'Cached: 10908876 kB' 'SwapCached: 76 kB' 'Active: 8335264 kB' 'Inactive: 3175828 kB' 'Active(anon): 7427936 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598684 kB' 'Mapped: 153512 kB' 'Shmem: 9167688 kB' 'KReclaimable: 575440 kB' 'Slab: 1579284 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003844 kB' 'KernelStack: 21840 kB' 'PageTables: 8408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11688932 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:05:11.350 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # read -r var val _ / continue for each field from MemTotal through AnonHugePages in the dump above -- none match HugePages_Surp (scan continues) 00:05:11.351 16:29:08
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.351 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
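[editor note] The trace above is SPDK's get_meminfo helper in setup/common.sh scanning every /proc/meminfo key until it reaches the one requested (here HugePages_Surp): each non-matching key hits the "continue" at common.sh@32, and the match falls through to "echo" and "return 0" at common.sh@33. A minimal standalone sketch of that parsing loop, assuming the standard "Key:   value kB" meminfo format (the function name get_meminfo_sketch is hypothetical; the real helper is get_meminfo):

get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # Per-node queries read the NUMA-local copy instead, when it exists.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    # Per-node files prefix every line with "Node N "; strip that first,
    # then split "Key:   value kB" on ':' and whitespace.
    while IFS=': ' read -r var val _; do
        # Keys that don't match the request are skipped (the "continue"s above).
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
    return 1
}

Usage matching the traced calls: surp=$(get_meminfo_sketch HugePages_Surp) system-wide, or get_meminfo_sketch HugePages_Surp 0 for node 0.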
00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.616 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42005920 kB' 'MemAvailable: 43632532 kB' 'Buffers: 6784 kB' 'Cached: 10908912 kB' 'SwapCached: 76 kB' 'Active: 8334952 kB' 'Inactive: 3175828 kB' 'Active(anon): 7427624 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598292 kB' 'Mapped: 153512 kB' 'Shmem: 9167724 kB' 'KReclaimable: 575440 kB' 'Slab: 1579284 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003844 kB' 'KernelStack: 21824 kB' 'PageTables: 8348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11688952 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:11.617 16:29:08 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.617 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.618 16:29:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:11.618 nr_hugepages=1024 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:11.618 resv_hugepages=0 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:11.618 surplus_hugepages=0 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:11.618 anon_hugepages=0 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:11.618 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42006508 kB' 'MemAvailable: 43633120 kB' 'Buffers: 6784 kB' 'Cached: 10908936 kB' 'SwapCached: 76 kB' 'Active: 8334976 kB' 'Inactive: 3175828 kB' 'Active(anon): 7427648 kB' 'Inactive(anon): 2335184 kB' 'Active(file): 907328 kB' 'Inactive(file): 840644 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 598292 kB' 'Mapped: 153512 kB' 'Shmem: 9167748 kB' 'KReclaimable: 575440 kB' 'Slab: 1579284 kB' 'SReclaimable: 575440 kB' 'SUnreclaim: 1003844 kB' 'KernelStack: 21824 kB' 'PageTables: 8348 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 11688976 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 218084 kB' 'VmallocChunk: 0 kB' 'Percpu: 114688 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.619 
16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.619 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:11.620 
16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.620 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.621 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 22533568 kB' 'MemUsed: 10100868 kB' 'SwapCached: 44 kB' 'Active: 5088412 kB' 'Inactive: 535260 kB' 'Active(anon): 4310852 kB' 'Inactive(anon): 56 kB' 'Active(file): 777560 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5340668 kB' 'Mapped: 75624 kB' 'AnonPages: 286116 kB' 'Shmem: 4027860 kB' 'KernelStack: 10872 kB' 'PageTables: 4880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 400876 kB' 'Slab: 886152 kB' 'SReclaimable: 400876 kB' 'SUnreclaim: 485276 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:11.621 [... 16:29:09 setup/common.sh@31-32 trace walks the node0 meminfo keys listed above in order, hitting "continue" on every key until HugePages_Surp matches ...] 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:11.622 node0=1024 expecting 1024 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:11.622 00:05:11.622 real 0m6.885s 00:05:11.622 user 0m2.508s 00:05:11.622 sys 0m4.393s 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:11.622 16:29:09 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:11.622 ************************************ 00:05:11.622 END TEST no_shrink_alloc 00:05:11.622 ************************************
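Condensed, the per-node bookkeeping above reduces to: split the expected page count across NUMA nodes, add reserved and per-node surplus pages, and compare against what sysfs reports. A rough sketch under those assumptions, reusing the get_meminfo sketch from earlier (the initial split and resv come from earlier in hugepages.sh and are hard-coded here for illustration):

declare -A nodes_sys nodes_test
resv=0
nodes_sys=([0]=1024 [1]=0)        # read from /sys/devices/system/node/node*/
nodes_test=([0]=1024 [1]=0)       # expected split computed by the test
for node in "${!nodes_test[@]}"; do
  (( nodes_test[node] += resv ))                      # reserved hugepages
  surp=$(get_meminfo HugePages_Surp "$node")          # per-node surplus
  (( nodes_test[node] += surp ))
  echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
done

With 1024 pages on node0 and neither surplus nor reserve, the test prints node0=1024 expecting 1024 and passes.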
16:29:09 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:05:11.622 16:29:09 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:11.622 [... 16:29:09 setup/hugepages.sh@38-40 trace echoes 0 into every hugepages-* sysfs file on node0 and node1 ...] 00:05:11.622 16:29:09 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:11.622 16:29:09 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:11.622 00:05:11.622 real 0m22.628s 00:05:11.622 user 0m7.763s 00:05:11.622 sys 0m13.456s 00:05:11.622 16:29:09 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:11.622 16:29:09 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:11.622 ************************************ 00:05:11.622 END TEST hugepages 00:05:11.622 ************************************ 00:05:11.622 16:29:09 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:11.622 16:29:09 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:11.622 16:29:09 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:11.622 16:29:09 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:11.622 ************************************ 00:05:11.622 START TEST driver 00:05:11.622 ************************************ 00:05:11.622 16:29:09 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:11.882 * Looking for test storage... 00:05:11.882 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:11.882 16:29:09 setup.sh.driver -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:05:11.882 16:29:09 setup.sh.driver -- common/autotest_common.sh@1681 -- # lcov --version 00:05:11.882 16:29:09 setup.sh.driver -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:05:11.882 16:29:09 setup.sh.driver -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:05:11.882 [... 16:29:09 scripts/common.sh@333-368 cmp_versions trace compares 1.15 '<' 2 field by field (ver1=(1 15), ver2=(2), 1 < 2 in the first field) and returns 0 ...] 00:05:11.882 16:29:09 setup.sh.driver -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:11.882 16:29:09 setup.sh.driver -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:05:11.882 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:11.882 --rc genhtml_branch_coverage=1 00:05:11.882 --rc genhtml_function_coverage=1 00:05:11.882 --rc genhtml_legend=1 00:05:11.882 --rc geninfo_all_blocks=1 00:05:11.882 --rc geninfo_unexecuted_blocks=1 00:05:11.882 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:11.883 ' 00:05:11.883 [... 16:29:09 common/autotest_common.sh@1694-1695 trace repeats the identical option block three more times while assigning LCOV_OPTS and exporting/assigning LCOV='lcov ...' ...] 00:05:11.883 16:29:09 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:11.883 16:29:09 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:11.883 16:29:09 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:17.161
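The lt 1.15 2 call condensed above is a dotted-version comparison. A self-contained sketch of that algorithm (scripts/common.sh adds validation of each component; this is the core idea):

cmp_versions() {               # cmp_versions <ver1> <op> <ver2>
  local IFS=.-: op=$2 v
  local -a ver1 ver2
  read -ra ver1 <<< "$1"; read -ra ver2 <<< "$3"
  local len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
  for (( v = 0; v < len; v++ )); do      # compare field by field; missing fields count as 0
    (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && { [[ $op == '>' ]]; return; }
    (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && { [[ $op == '<' ]]; return; }
  done
  [[ $op == '=' ]]
}
lt() { cmp_versions "$1" '<' "$2"; }

lt 1.15 2 splits into (1 15) versus (2), sees 1 < 2 in the first field, and returns 0, after which the trace selects the lcov 1.x-style --rc coverage flags.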
16:29:14 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:17.161 16:29:14 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:17.161 16:29:14 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:17.161 16:29:14 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:17.161 ************************************ 00:05:17.161 START TEST guess_driver 00:05:17.161 ************************************ 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_groups 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:17.161 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:17.161 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:17.161 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:17.161 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:17.161 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:17.161 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:17.161 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:17.161 Looking for driver=vfio-pci 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:17.161
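The pick traced above prefers vfio-pci whenever IOMMU groups are populated and the module resolves. A hedged sketch of that decision (helper names are illustrative; the real script also honors the unsafe-noiommu knob checked above):

shopt -s nullglob
is_driver() {                 # does the module (or a dependency) resolve to a .ko?
  modprobe --show-depends "$1" 2>/dev/null | grep -q '\.ko'
}
pick_driver() {
  local groups=(/sys/kernel/iommu_groups/*)
  if (( ${#groups[@]} > 0 )) && is_driver vfio_pci; then
    echo vfio-pci               # IOMMU present: use the safe driver
  elif is_driver uio_pci_generic; then
    echo uio_pci_generic        # fallback without an IOMMU (illustrative)
  else
    echo 'No valid driver found'
  fi
}

On this node 176 IOMMU groups exist and modprobe resolves the whole vfio-pci dependency chain, so the test settles on vfio-pci.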
16:29:14 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:17.161 16:29:14 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:20.450 [... 16:29:17-16:29:19 setup/driver.sh@57-61 trace repeats for each device line setup.sh config prints: every line satisfies "[[ -> == \-\> ]]" and "[[ vfio-pci == vfio-pci ]]", so fail stays 0 ...] 00:05:21.832 16:29:19 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:21.832 16:29:19 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:21.832 16:29:19 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:21.832 16:29:19 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:27.232 00:05:27.232 real 0m9.933s 00:05:27.232 user 0m2.644s 00:05:27.232 sys 0m5.028s 00:05:27.232 16:29:24 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:27.232 16:29:24 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:27.232 ************************************ 00:05:27.232 END TEST guess_driver 00:05:27.232 ************************************ 00:05:27.233 00:05:27.233 real 0m14.823s 00:05:27.233 user 0m4.004s 00:05:27.233 sys 0m7.795s 00:05:27.233 16:29:24 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:27.233 16:29:24 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:27.233 ************************************ 00:05:27.233 END TEST driver 00:05:27.233 ************************************
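The scan condensed inside the guess_driver test is a line-by-line check of setup.sh config output: each device line carries a "-> driver" marker, and every one must name the driver that was just picked. A minimal sketch under that assumed output format:

fail=0 driver=vfio-pci
while read -r _ _ _ _ marker setup_driver; do
  [[ $marker == '->' ]] || continue            # only device lines with a binding arrow
  [[ $setup_driver == "$driver" ]] || fail=1   # any other driver fails the test
done < <(/path/to/spdk/scripts/setup.sh config)  # path illustrative
(( fail == 0 )) && echo "all devices bound to $driver"

The field positions in "read -r _ _ _ _ marker setup_driver" match the trace above; once the loop finishes with fail=0, the test resets the devices and the suite ends.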
00:05:27.233 16:29:24 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:27.233 16:29:24 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:27.233 16:29:24 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:27.233 16:29:24 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:27.233 ************************************ 00:05:27.233 START TEST devices 00:05:27.233 ************************************ 00:05:27.233 16:29:24 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:27.233 * Looking for test storage... 00:05:27.233 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:27.233 [... 16:29:24 common/autotest_common.sh@1680-1695 and scripts/common.sh@333-368 trace repeats the same lcov version check (lt 1.15 2) and the same LCOV_OPTS/LCOV exports shown for the driver suite above ...] 00:05:27.233 16:29:24 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:27.233 16:29:24 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:27.233 16:29:24 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:27.233
16:29:24 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:30.555 16:29:28 setup.sh.devices -- common/autotest_common.sh@1655 -- # zoned_devs=() 00:05:30.555 16:29:28 setup.sh.devices -- common/autotest_common.sh@1655 -- # local -gA zoned_devs 00:05:30.555 16:29:28 setup.sh.devices -- common/autotest_common.sh@1656 -- # local nvme bdf 00:05:30.555 16:29:28 setup.sh.devices -- common/autotest_common.sh@1658 -- # for nvme in /sys/block/nvme* 00:05:30.555 16:29:28 setup.sh.devices -- common/autotest_common.sh@1659 -- # is_block_zoned nvme0n1 00:05:30.555 16:29:28 setup.sh.devices -- common/autotest_common.sh@1648 -- # local device=nvme0n1 00:05:30.555 16:29:28 setup.sh.devices -- common/autotest_common.sh@1650 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:30.555 16:29:28 setup.sh.devices -- common/autotest_common.sh@1651 -- # [[ none != none ]] 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:05:30.555 16:29:28 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:30.555 16:29:28 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:30.555 16:29:28 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:30.815 No valid GPT data, bailing 00:05:30.815 16:29:28 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:30.815 16:29:28 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:30.815 16:29:28 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:30.815 16:29:28 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:30.815 16:29:28 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:30.815 16:29:28 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:30.815 16:29:28 setup.sh.devices -- setup/common.sh@80 -- # echo 1600321314816 00:05:30.815 16:29:28 setup.sh.devices -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:05:30.815 16:29:28 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:30.815 16:29:28 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:05:30.815 16:29:28 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:30.815 16:29:28 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:30.815 16:29:28 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:30.815 16:29:28 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:30.815
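The gating traced above qualifies a disk for the tests: not zoned, not already carrying a partition table, and at least min_disk_size (3 GiB) large. A rough equivalent (the trace uses scripts/spdk-gpt.py plus blkid for the in-use check; plain blkid stands in for both here):

shopt -s nullglob
min_disk_size=$((3 * 1024 * 1024 * 1024))
blocks=()
for block in /sys/block/nvme*; do
  dev=${block##*/}
  [[ $(<"$block/queue/zoned") != none ]] && continue            # skip zoned namespaces
  [[ -n $(blkid -s PTTYPE -o value "/dev/$dev") ]] && continue  # skip disks already in use
  size=$(( $(<"$block/size") * 512 ))     # sysfs size is in 512-byte sectors
  (( size >= min_disk_size )) && blocks+=("$dev")
done
echo "test disk: ${blocks[0]:-none}"

Here nvme0n1 has no GPT, reports 1600321314816 bytes (about 1.6 TB), and becomes the test disk.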
16:29:28 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:30.815 16:29:28 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:30.815 ************************************ 00:05:30.815 START TEST nvme_mount 00:05:30.815 ************************************ 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:30.815 16:29:28 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:31.752 Creating new GPT entries in memory. 00:05:31.752 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:31.752 other utilities. 00:05:31.752 16:29:29 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:31.752 16:29:29 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:31.752 16:29:29 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:31.752 16:29:29 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:31.752 16:29:29 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:32.691 Creating new GPT entries in memory. 00:05:32.691 The operation has completed successfully.
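The partition step above reduces to two sgdisk calls: wipe the label, then create one partition spanning sectors 2048-2099199, i.e. (2099199 - 2048 + 1) * 512 = 1073741824 bytes, the 1 GiB requested by partition_drive. A hedged reconstruction (device name illustrative; the test serializes udev events via sync_dev_uevents.sh, which partprobe merely approximates here):

disk=/dev/nvme0n1
sgdisk "$disk" --zap-all                           # destroy GPT and MBR structures
flock "$disk" sgdisk "$disk" --new=1:2048:2099199  # p1: 1 GiB starting at sector 2048
partprobe "$disk" 2>/dev/null || true              # ask the kernel to re-read the table

The flock matches the trace: the partition-table write is serialized against any concurrent user of the device node.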
00:05:32.691 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:32.691 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:32.691 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 3724050 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:32.951 16:29:30 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:36.242 [... 16:29:33 setup/devices.sh@62/@60 trace checks each reported address (0000:00:04.0-7, 0000:80:04.0-7) against 0000:d8:00.0 and reads on past every non-match ...] 00:05:36.242 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:36.242 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:36.242 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:36.242
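The verify pass condensed above re-runs setup.sh config with PCI_ALLOWED pinned to the test controller and demands that its status line advertise the active mount. A sketch under the same assumed output format as the driver scan:

pci_allowed=0000:d8:00.0 found=0
while read -r pci _ _ status; do
  [[ $pci == "$pci_allowed" ]] || continue            # only the test controller's line
  [[ $status == *"nvme0n1:nvme0n1p1"* ]] && found=1   # the mount must be reported
done < <(PCI_ALLOWED=$pci_allowed /path/to/spdk/scripts/setup.sh config)  # path illustrative
(( found == 1 )) && echo "mount on $pci_allowed verified"

The "Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev" line in the trace is exactly the status text this match keys on.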
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:36.242 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:36.242 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:36.242 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:36.243 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:36.243 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:36.243 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:36.243 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:36.243 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:36.243 16:29:33 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:36 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.543 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:39.543 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:39.543 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:39.543 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # 
local pci status 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:39.803 16:29:37 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:43.095 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:43.095 00:05:43.095 real 0m12.380s 00:05:43.095 user 0m3.515s 00:05:43.095 sys 0m6.756s 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:43.095 16:29:40 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:43.095 ************************************ 00:05:43.095 END TEST nvme_mount 00:05:43.095 ************************************ 00:05:43.095 16:29:40 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:43.095 16:29:40 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:43.095 16:29:40 setup.sh.devices -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:43.095 16:29:40 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:43.355 ************************************ 00:05:43.355 START TEST dm_mount 00:05:43.355 ************************************ 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # 
local disk=nvme0n1 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:43.355 16:29:40 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:44.294 Creating new GPT entries in memory. 00:05:44.294 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:44.294 other utilities. 00:05:44.294 16:29:41 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:44.294 16:29:41 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:44.294 16:29:41 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:44.294 16:29:41 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:44.294 16:29:41 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:45.232 Creating new GPT entries in memory. 00:05:45.232 The operation has completed successfully. 00:05:45.232 16:29:42 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:45.232 16:29:42 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:45.232 16:29:42 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:45.232 16:29:42 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:45.232 16:29:42 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:46.611 The operation has completed successfully. 
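For reference, the partitioning the harness just performed reduces to the short sketch below. It is a minimal reconstruction, not the harness code itself: it assumes a disposable scratch disk at /dev/nvme0n1 with 512 B sectors, and it elides the flock serialization and sync_dev_uevents.sh udev plumbing visible in the trace.

  # wipe any existing GPT/MBR metadata from the scratch disk
  sgdisk /dev/nvme0n1 --zap-all
  # carve two 1 GiB partitions (2097152 sectors each, starting at sector 2048)
  sgdisk /dev/nvme0n1 --new=1:2048:2099199
  sgdisk /dev/nvme0n1 --new=2:2099200:4196351
  # wait until udev has created /dev/nvme0n1p1 and /dev/nvme0n1p2
  udevadm settle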
00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 3728479 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:46.611 16:29:43 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:46.611 16:29:44 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:46.611 16:29:44 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.611 16:29:44 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:46.611 16:29:44 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:46.611 16:29:44 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:46.611 16:29:44 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:49.901 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.901 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.901 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.901 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.901 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.901 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.901 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:49.902 16:29:47 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:49.902 16:29:47 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ 
status 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.435 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:52.695 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:52.954 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:52.954 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:52.954 16:29:50 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:52.954 00:05:52.954 real 0m9.603s 00:05:52.954 user 0m2.187s 00:05:52.954 sys 0m4.453s 00:05:52.954 16:29:50 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:52.954 16:29:50 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:52.954 ************************************ 00:05:52.954 END TEST dm_mount 00:05:52.954 ************************************ 00:05:52.954 16:29:50 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:52.954 16:29:50 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:52.954 16:29:50 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:52.954 16:29:50 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:52.954 16:29:50 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:52.954 16:29:50 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:52.954 16:29:50 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:53.213 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:53.213 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:05:53.213 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:53.213 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:53.213 16:29:50 setup.sh.devices -- 
setup/devices.sh@12 -- # cleanup_dm 00:05:53.213 16:29:50 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:53.213 16:29:50 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:53.213 16:29:50 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:53.213 16:29:50 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:53.213 16:29:50 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:53.213 16:29:50 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:53.213 00:05:53.213 real 0m26.562s 00:05:53.213 user 0m7.264s 00:05:53.213 sys 0m14.149s 00:05:53.213 16:29:50 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:53.213 16:29:50 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:53.213 ************************************ 00:05:53.213 END TEST devices 00:05:53.213 ************************************ 00:05:53.213 00:05:53.213 real 1m28.788s 00:05:53.213 user 0m27.000s 00:05:53.213 sys 0m50.380s 00:05:53.213 16:29:50 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:53.213 16:29:50 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:53.214 ************************************ 00:05:53.214 END TEST setup.sh 00:05:53.214 ************************************ 00:05:53.214 16:29:50 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:56.504 Hugepages 00:05:56.504 node hugesize free / total 00:05:56.504 node0 1048576kB 0 / 0 00:05:56.504 node0 2048kB 1024 / 1024 00:05:56.504 node1 1048576kB 0 / 0 00:05:56.504 node1 2048kB 1024 / 1024 00:05:56.504 00:05:56.504 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:56.504 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:56.504 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:56.504 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:56.504 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:56.504 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:56.504 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:56.504 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:56.504 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:56.504 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:56.504 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:56.504 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:56.504 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:56.504 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:56.504 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:56.504 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:56.504 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:56.762 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:05:56.762 16:29:54 -- spdk/autotest.sh@117 -- # uname -s 00:05:56.762 16:29:54 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:05:56.762 16:29:54 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:05:56.762 16:29:54 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:00.049 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:00:04.1 (8086 2021): ioatdma 
-> vfio-pci 00:06:00.049 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:00.049 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:01.429 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:01.689 16:29:59 -- common/autotest_common.sh@1515 -- # sleep 1 00:06:02.627 16:30:00 -- common/autotest_common.sh@1516 -- # bdfs=() 00:06:02.627 16:30:00 -- common/autotest_common.sh@1516 -- # local bdfs 00:06:02.627 16:30:00 -- common/autotest_common.sh@1518 -- # bdfs=($(get_nvme_bdfs)) 00:06:02.627 16:30:00 -- common/autotest_common.sh@1518 -- # get_nvme_bdfs 00:06:02.627 16:30:00 -- common/autotest_common.sh@1496 -- # bdfs=() 00:06:02.627 16:30:00 -- common/autotest_common.sh@1496 -- # local bdfs 00:06:02.627 16:30:00 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:02.627 16:30:00 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:02.627 16:30:00 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:06:02.886 16:30:00 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:06:02.886 16:30:00 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:d8:00.0 00:06:02.886 16:30:00 -- common/autotest_common.sh@1520 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:06.176 Waiting for block devices as requested 00:06:06.176 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:06.176 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:06.176 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:06.176 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:06.176 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:06.176 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:06.436 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:06.436 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:06.436 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:06.695 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:06.695 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:06.695 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:06.695 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:06.955 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:06.955 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:06.955 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:07.215 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:06:07.475 16:30:04 -- common/autotest_common.sh@1522 -- # for bdf in "${bdfs[@]}" 00:06:07.475 16:30:04 -- common/autotest_common.sh@1523 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:06:07.475 16:30:04 -- common/autotest_common.sh@1485 -- # readlink -f /sys/class/nvme/nvme0 00:06:07.475 16:30:04 -- common/autotest_common.sh@1485 -- # grep 0000:d8:00.0/nvme/nvme 00:06:07.475 16:30:04 -- common/autotest_common.sh@1485 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:07.475 16:30:04 -- common/autotest_common.sh@1486 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:06:07.475 16:30:04 -- common/autotest_common.sh@1490 -- # basename 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:06:07.475 16:30:04 -- common/autotest_common.sh@1490 -- # printf '%s\n' nvme0 00:06:07.475 16:30:04 -- common/autotest_common.sh@1523 -- # nvme_ctrlr=/dev/nvme0 00:06:07.475 16:30:04 -- common/autotest_common.sh@1524 -- # [[ -z /dev/nvme0 ]] 00:06:07.475 16:30:04 -- common/autotest_common.sh@1529 -- # nvme id-ctrl /dev/nvme0 00:06:07.475 16:30:04 -- common/autotest_common.sh@1529 -- # grep oacs 00:06:07.475 16:30:04 -- common/autotest_common.sh@1529 -- # cut -d: -f2 00:06:07.475 16:30:04 -- common/autotest_common.sh@1529 -- # oacs=' 0xe' 00:06:07.475 16:30:04 -- common/autotest_common.sh@1530 -- # oacs_ns_manage=8 00:06:07.475 16:30:04 -- common/autotest_common.sh@1532 -- # [[ 8 -ne 0 ]] 00:06:07.475 16:30:04 -- common/autotest_common.sh@1538 -- # nvme id-ctrl /dev/nvme0 00:06:07.475 16:30:04 -- common/autotest_common.sh@1538 -- # grep unvmcap 00:06:07.475 16:30:04 -- common/autotest_common.sh@1538 -- # cut -d: -f2 00:06:07.475 16:30:04 -- common/autotest_common.sh@1538 -- # unvmcap=' 0' 00:06:07.475 16:30:04 -- common/autotest_common.sh@1539 -- # [[ 0 -eq 0 ]] 00:06:07.475 16:30:04 -- common/autotest_common.sh@1541 -- # continue 00:06:07.475 16:30:04 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:07.475 16:30:04 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:07.475 16:30:04 -- common/autotest_common.sh@10 -- # set +x 00:06:07.475 16:30:04 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:07.475 16:30:04 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:07.475 16:30:04 -- common/autotest_common.sh@10 -- # set +x 00:06:07.475 16:30:04 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:10.762 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:10.762 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:12.667 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:06:12.667 16:30:09 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:12.667 16:30:09 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:12.667 16:30:09 -- common/autotest_common.sh@10 -- # set +x 00:06:12.667 16:30:10 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:12.667 16:30:10 -- common/autotest_common.sh@1576 -- # mapfile -t bdfs 00:06:12.667 16:30:10 -- common/autotest_common.sh@1576 -- # get_nvme_bdfs_by_id 0x0a54 00:06:12.667 16:30:10 -- common/autotest_common.sh@1561 -- # bdfs=() 00:06:12.667 16:30:10 -- common/autotest_common.sh@1561 -- # _bdfs=() 00:06:12.667 16:30:10 -- common/autotest_common.sh@1561 -- # local bdfs _bdfs 00:06:12.667 16:30:10 -- common/autotest_common.sh@1562 -- # _bdfs=($(get_nvme_bdfs)) 00:06:12.667 16:30:10 -- 
common/autotest_common.sh@1562 -- # get_nvme_bdfs 00:06:12.667 16:30:10 -- common/autotest_common.sh@1496 -- # bdfs=() 00:06:12.667 16:30:10 -- common/autotest_common.sh@1496 -- # local bdfs 00:06:12.667 16:30:10 -- common/autotest_common.sh@1497 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:12.667 16:30:10 -- common/autotest_common.sh@1497 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:12.667 16:30:10 -- common/autotest_common.sh@1497 -- # jq -r '.config[].params.traddr' 00:06:12.667 16:30:10 -- common/autotest_common.sh@1498 -- # (( 1 == 0 )) 00:06:12.667 16:30:10 -- common/autotest_common.sh@1502 -- # printf '%s\n' 0000:d8:00.0 00:06:12.667 16:30:10 -- common/autotest_common.sh@1563 -- # for bdf in "${_bdfs[@]}" 00:06:12.667 16:30:10 -- common/autotest_common.sh@1564 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:06:12.667 16:30:10 -- common/autotest_common.sh@1564 -- # device=0x0a54 00:06:12.667 16:30:10 -- common/autotest_common.sh@1565 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:12.667 16:30:10 -- common/autotest_common.sh@1566 -- # bdfs+=($bdf) 00:06:12.667 16:30:10 -- common/autotest_common.sh@1570 -- # (( 1 > 0 )) 00:06:12.667 16:30:10 -- common/autotest_common.sh@1571 -- # printf '%s\n' 0000:d8:00.0 00:06:12.667 16:30:10 -- common/autotest_common.sh@1577 -- # [[ -z 0000:d8:00.0 ]] 00:06:12.667 16:30:10 -- common/autotest_common.sh@1582 -- # spdk_tgt_pid=3738577 00:06:12.667 16:30:10 -- common/autotest_common.sh@1583 -- # waitforlisten 3738577 00:06:12.667 16:30:10 -- common/autotest_common.sh@1581 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:12.667 16:30:10 -- common/autotest_common.sh@831 -- # '[' -z 3738577 ']' 00:06:12.667 16:30:10 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:12.667 16:30:10 -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:12.667 16:30:10 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:12.667 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:12.667 16:30:10 -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:12.667 16:30:10 -- common/autotest_common.sh@10 -- # set +x 00:06:12.667 [2024-11-28 16:30:10.177459] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
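The controller discovery being traced here comes down to one jq idiom. The sketch below assumes SPDK's scripts/gen_nvme.sh is available and emits a JSON bdev config, as it does in this tree; 0x0a54 is the PCI device ID of the Intel NVMe drive on this rig, taken from the sysfs read in the trace.

  # enumerate the PCI addresses (BDFs) of all NVMe controllers SPDK can see
  bdfs=($(scripts/gen_nvme.sh | jq -r '.config[].params.traddr'))
  # keep only controllers whose PCI device ID matches the target part
  for bdf in "${bdfs[@]}"; do
    [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && printf '%s\n' "$bdf"
  done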
00:06:12.667 [2024-11-28 16:30:10.177546] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3738577 ] 00:06:12.667 [2024-11-28 16:30:10.245049] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.667 [2024-11-28 16:30:10.284945] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.926 16:30:10 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:12.927 16:30:10 -- common/autotest_common.sh@864 -- # return 0 00:06:12.927 16:30:10 -- common/autotest_common.sh@1585 -- # bdf_id=0 00:06:12.927 16:30:10 -- common/autotest_common.sh@1586 -- # for bdf in "${bdfs[@]}" 00:06:12.927 16:30:10 -- common/autotest_common.sh@1587 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:06:16.217 nvme0n1 00:06:16.217 16:30:13 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:16.217 [2024-11-28 16:30:13.664004] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:16.217 request: 00:06:16.217 { 00:06:16.217 "nvme_ctrlr_name": "nvme0", 00:06:16.217 "password": "test", 00:06:16.217 "method": "bdev_nvme_opal_revert", 00:06:16.217 "req_id": 1 00:06:16.217 } 00:06:16.217 Got JSON-RPC error response 00:06:16.217 response: 00:06:16.217 { 00:06:16.217 "code": -32602, 00:06:16.217 "message": "Invalid parameters" 00:06:16.217 } 00:06:16.217 16:30:13 -- common/autotest_common.sh@1589 -- # true 00:06:16.217 16:30:13 -- common/autotest_common.sh@1590 -- # (( ++bdf_id )) 00:06:16.217 16:30:13 -- common/autotest_common.sh@1593 -- # killprocess 3738577 00:06:16.217 16:30:13 -- common/autotest_common.sh@950 -- # '[' -z 3738577 ']' 00:06:16.217 16:30:13 -- common/autotest_common.sh@954 -- # kill -0 3738577 00:06:16.217 16:30:13 -- common/autotest_common.sh@955 -- # uname 00:06:16.217 16:30:13 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:16.217 16:30:13 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3738577 00:06:16.217 16:30:13 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:16.217 16:30:13 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:16.217 16:30:13 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3738577' 00:06:16.217 killing process with pid 3738577 00:06:16.217 16:30:13 -- common/autotest_common.sh@969 -- # kill 3738577 00:06:16.217 16:30:13 -- common/autotest_common.sh@974 -- # wait 3738577 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.217 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 
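The Opal revert attempt above is driven over SPDK's JSON-RPC socket. A sketch of the two calls as issued, with the repository-relative rpc.py path standing in for the full workspace path: the controller is attached by PCI address, then the revert is tried with the test password; on a controller without Opal support the call fails with -32602 "Invalid parameters" exactly as logged, so the harness tolerates the error.

  # attach the NVMe controller at its PCI address as bdev controller "nvme0"
  scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
  # try the Opal revert; tolerate failure on drives without Opal support
  scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test || true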
00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead
of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: 
Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping 
cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:16.218 EAL: Unexpected size 0 of DMA remapping cleared instead of 2097152 00:06:18.754 16:30:15 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:18.754 16:30:15 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:18.754 16:30:15 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:18.754 16:30:15 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:18.754 16:30:15 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:18.754 16:30:15 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:18.754 16:30:15 -- common/autotest_common.sh@10 -- # set +x 00:06:18.754 16:30:15 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:18.754 16:30:15 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:18.754 16:30:15 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:18.754 16:30:15 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:18.754 16:30:15 -- common/autotest_common.sh@10 -- # set +x 00:06:18.754 ************************************ 00:06:18.754 START TEST env 00:06:18.754 ************************************ 00:06:18.754 16:30:15 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:18.754 * Looking for test storage... 00:06:18.754 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1681 -- # lcov --version 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:18.754 16:30:16 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:18.754 16:30:16 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:18.754 16:30:16 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:18.754 16:30:16 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:18.754 16:30:16 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:18.754 16:30:16 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:18.754 16:30:16 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:18.754 16:30:16 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:18.754 16:30:16 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:18.754 16:30:16 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:18.754 16:30:16 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:18.754 16:30:16 env -- scripts/common.sh@344 -- # case "$op" in 00:06:18.754 16:30:16 env -- scripts/common.sh@345 -- # : 1 00:06:18.754 16:30:16 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:18.754 16:30:16 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:18.754 16:30:16 env -- scripts/common.sh@365 -- # decimal 1 00:06:18.754 16:30:16 env -- scripts/common.sh@353 -- # local d=1 00:06:18.754 16:30:16 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:18.754 16:30:16 env -- scripts/common.sh@355 -- # echo 1 00:06:18.754 16:30:16 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:18.754 16:30:16 env -- scripts/common.sh@366 -- # decimal 2 00:06:18.754 16:30:16 env -- scripts/common.sh@353 -- # local d=2 00:06:18.754 16:30:16 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:18.754 16:30:16 env -- scripts/common.sh@355 -- # echo 2 00:06:18.754 16:30:16 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:18.754 16:30:16 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:18.754 16:30:16 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:18.754 16:30:16 env -- scripts/common.sh@368 -- # return 0 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:18.754 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:18.754 --rc genhtml_branch_coverage=1 00:06:18.754 --rc genhtml_function_coverage=1 00:06:18.754 --rc genhtml_legend=1 00:06:18.754 --rc geninfo_all_blocks=1 00:06:18.754 --rc geninfo_unexecuted_blocks=1 00:06:18.754 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:18.754 ' 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:18.754 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:18.754 --rc genhtml_branch_coverage=1 00:06:18.754 --rc genhtml_function_coverage=1 00:06:18.754 --rc genhtml_legend=1 00:06:18.754 --rc geninfo_all_blocks=1 00:06:18.754 --rc geninfo_unexecuted_blocks=1 00:06:18.754 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:18.754 ' 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:18.754 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:18.754 --rc genhtml_branch_coverage=1 00:06:18.754 --rc genhtml_function_coverage=1 00:06:18.754 --rc genhtml_legend=1 00:06:18.754 --rc geninfo_all_blocks=1 00:06:18.754 --rc geninfo_unexecuted_blocks=1 00:06:18.754 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:18.754 ' 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:18.754 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:18.754 --rc genhtml_branch_coverage=1 00:06:18.754 --rc genhtml_function_coverage=1 00:06:18.754 --rc genhtml_legend=1 00:06:18.754 --rc geninfo_all_blocks=1 00:06:18.754 --rc geninfo_unexecuted_blocks=1 00:06:18.754 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:18.754 ' 00:06:18.754 16:30:16 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:18.754 16:30:16 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:18.754 16:30:16 env -- common/autotest_common.sh@10 -- # set +x 00:06:18.754 ************************************ 00:06:18.754 START TEST env_memory 00:06:18.754 ************************************ 00:06:18.754 16:30:16 env.env_memory -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:18.754 00:06:18.754 00:06:18.754 CUnit - A unit testing framework for C - Version 2.1-3 00:06:18.754 http://cunit.sourceforge.net/ 00:06:18.754 00:06:18.754 00:06:18.754 Suite: memory 00:06:18.754 Test: alloc and free memory map ...[2024-11-28 16:30:16.249251] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:18.754 passed 00:06:18.754 Test: mem map translation ...[2024-11-28 16:30:16.262411] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:18.754 [2024-11-28 16:30:16.262427] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:18.755 [2024-11-28 16:30:16.262458] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:18.755 [2024-11-28 16:30:16.262468] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:18.755 passed 00:06:18.755 Test: mem map registration ...[2024-11-28 16:30:16.283604] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:18.755 [2024-11-28 16:30:16.283623] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:18.755 passed 00:06:18.755 Test: mem map adjacent registrations ...passed 00:06:18.755 00:06:18.755 Run Summary: Type Total Ran Passed Failed Inactive 00:06:18.755 suites 1 1 n/a 0 0 00:06:18.755 tests 4 4 4 0 0 00:06:18.755 asserts 152 152 152 0 n/a 00:06:18.755 00:06:18.755 Elapsed time = 0.084 seconds 00:06:18.755 00:06:18.755 real 0m0.098s 00:06:18.755 user 0m0.086s 00:06:18.755 sys 0m0.011s 00:06:18.755 16:30:16 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:18.755 16:30:16 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:18.755 ************************************ 00:06:18.755 END TEST env_memory 00:06:18.755 ************************************ 00:06:18.755 16:30:16 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:18.755 16:30:16 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:18.755 16:30:16 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:18.755 16:30:16 env -- common/autotest_common.sh@10 -- # set +x 00:06:18.755 ************************************ 00:06:18.755 START TEST env_vtophys 00:06:18.755 ************************************ 00:06:18.755 16:30:16 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:19.015 EAL: lib.eal log level changed from notice to debug 00:06:19.015 EAL: Detected lcore 0 as core 0 on socket 0 00:06:19.015 EAL: Detected lcore 1 as core 1 on socket 0 00:06:19.015 EAL: Detected lcore 2 as core 2 on socket 0 00:06:19.015 EAL: Detected lcore 3 as 
core 3 on socket 0 00:06:19.015 EAL: Detected lcore 4 as core 4 on socket 0 00:06:19.015 EAL: Detected lcore 5 as core 5 on socket 0 00:06:19.015 EAL: Detected lcore 6 as core 6 on socket 0 00:06:19.015 EAL: Detected lcore 7 as core 8 on socket 0 00:06:19.015 EAL: Detected lcore 8 as core 9 on socket 0 00:06:19.015 EAL: Detected lcore 9 as core 10 on socket 0 00:06:19.015 EAL: Detected lcore 10 as core 11 on socket 0 00:06:19.015 EAL: Detected lcore 11 as core 12 on socket 0 00:06:19.015 EAL: Detected lcore 12 as core 13 on socket 0 00:06:19.015 EAL: Detected lcore 13 as core 14 on socket 0 00:06:19.015 EAL: Detected lcore 14 as core 16 on socket 0 00:06:19.015 EAL: Detected lcore 15 as core 17 on socket 0 00:06:19.015 EAL: Detected lcore 16 as core 18 on socket 0 00:06:19.015 EAL: Detected lcore 17 as core 19 on socket 0 00:06:19.015 EAL: Detected lcore 18 as core 20 on socket 0 00:06:19.015 EAL: Detected lcore 19 as core 21 on socket 0 00:06:19.015 EAL: Detected lcore 20 as core 22 on socket 0 00:06:19.015 EAL: Detected lcore 21 as core 24 on socket 0 00:06:19.015 EAL: Detected lcore 22 as core 25 on socket 0 00:06:19.015 EAL: Detected lcore 23 as core 26 on socket 0 00:06:19.015 EAL: Detected lcore 24 as core 27 on socket 0 00:06:19.015 EAL: Detected lcore 25 as core 28 on socket 0 00:06:19.015 EAL: Detected lcore 26 as core 29 on socket 0 00:06:19.015 EAL: Detected lcore 27 as core 30 on socket 0 00:06:19.015 EAL: Detected lcore 28 as core 0 on socket 1 00:06:19.015 EAL: Detected lcore 29 as core 1 on socket 1 00:06:19.015 EAL: Detected lcore 30 as core 2 on socket 1 00:06:19.015 EAL: Detected lcore 31 as core 3 on socket 1 00:06:19.015 EAL: Detected lcore 32 as core 4 on socket 1 00:06:19.015 EAL: Detected lcore 33 as core 5 on socket 1 00:06:19.015 EAL: Detected lcore 34 as core 6 on socket 1 00:06:19.015 EAL: Detected lcore 35 as core 8 on socket 1 00:06:19.015 EAL: Detected lcore 36 as core 9 on socket 1 00:06:19.015 EAL: Detected lcore 37 as core 10 on socket 1 00:06:19.015 EAL: Detected lcore 38 as core 11 on socket 1 00:06:19.015 EAL: Detected lcore 39 as core 12 on socket 1 00:06:19.015 EAL: Detected lcore 40 as core 13 on socket 1 00:06:19.015 EAL: Detected lcore 41 as core 14 on socket 1 00:06:19.015 EAL: Detected lcore 42 as core 16 on socket 1 00:06:19.015 EAL: Detected lcore 43 as core 17 on socket 1 00:06:19.015 EAL: Detected lcore 44 as core 18 on socket 1 00:06:19.015 EAL: Detected lcore 45 as core 19 on socket 1 00:06:19.015 EAL: Detected lcore 46 as core 20 on socket 1 00:06:19.015 EAL: Detected lcore 47 as core 21 on socket 1 00:06:19.015 EAL: Detected lcore 48 as core 22 on socket 1 00:06:19.015 EAL: Detected lcore 49 as core 24 on socket 1 00:06:19.015 EAL: Detected lcore 50 as core 25 on socket 1 00:06:19.015 EAL: Detected lcore 51 as core 26 on socket 1 00:06:19.015 EAL: Detected lcore 52 as core 27 on socket 1 00:06:19.015 EAL: Detected lcore 53 as core 28 on socket 1 00:06:19.015 EAL: Detected lcore 54 as core 29 on socket 1 00:06:19.015 EAL: Detected lcore 55 as core 30 on socket 1 00:06:19.015 EAL: Detected lcore 56 as core 0 on socket 0 00:06:19.015 EAL: Detected lcore 57 as core 1 on socket 0 00:06:19.015 EAL: Detected lcore 58 as core 2 on socket 0 00:06:19.015 EAL: Detected lcore 59 as core 3 on socket 0 00:06:19.015 EAL: Detected lcore 60 as core 4 on socket 0 00:06:19.015 EAL: Detected lcore 61 as core 5 on socket 0 00:06:19.015 EAL: Detected lcore 62 as core 6 on socket 0 00:06:19.015 EAL: Detected lcore 63 as core 8 on socket 0 00:06:19.015 EAL: 
Detected lcore 64 as core 9 on socket 0 00:06:19.015 EAL: Detected lcore 65 as core 10 on socket 0 00:06:19.015 EAL: Detected lcore 66 as core 11 on socket 0 00:06:19.015 EAL: Detected lcore 67 as core 12 on socket 0 00:06:19.015 EAL: Detected lcore 68 as core 13 on socket 0 00:06:19.015 EAL: Detected lcore 69 as core 14 on socket 0 00:06:19.015 EAL: Detected lcore 70 as core 16 on socket 0 00:06:19.015 EAL: Detected lcore 71 as core 17 on socket 0 00:06:19.015 EAL: Detected lcore 72 as core 18 on socket 0 00:06:19.015 EAL: Detected lcore 73 as core 19 on socket 0 00:06:19.015 EAL: Detected lcore 74 as core 20 on socket 0 00:06:19.015 EAL: Detected lcore 75 as core 21 on socket 0 00:06:19.015 EAL: Detected lcore 76 as core 22 on socket 0 00:06:19.015 EAL: Detected lcore 77 as core 24 on socket 0 00:06:19.015 EAL: Detected lcore 78 as core 25 on socket 0 00:06:19.015 EAL: Detected lcore 79 as core 26 on socket 0 00:06:19.015 EAL: Detected lcore 80 as core 27 on socket 0 00:06:19.015 EAL: Detected lcore 81 as core 28 on socket 0 00:06:19.015 EAL: Detected lcore 82 as core 29 on socket 0 00:06:19.015 EAL: Detected lcore 83 as core 30 on socket 0 00:06:19.015 EAL: Detected lcore 84 as core 0 on socket 1 00:06:19.015 EAL: Detected lcore 85 as core 1 on socket 1 00:06:19.015 EAL: Detected lcore 86 as core 2 on socket 1 00:06:19.015 EAL: Detected lcore 87 as core 3 on socket 1 00:06:19.015 EAL: Detected lcore 88 as core 4 on socket 1 00:06:19.015 EAL: Detected lcore 89 as core 5 on socket 1 00:06:19.015 EAL: Detected lcore 90 as core 6 on socket 1 00:06:19.015 EAL: Detected lcore 91 as core 8 on socket 1 00:06:19.015 EAL: Detected lcore 92 as core 9 on socket 1 00:06:19.015 EAL: Detected lcore 93 as core 10 on socket 1 00:06:19.015 EAL: Detected lcore 94 as core 11 on socket 1 00:06:19.015 EAL: Detected lcore 95 as core 12 on socket 1 00:06:19.015 EAL: Detected lcore 96 as core 13 on socket 1 00:06:19.015 EAL: Detected lcore 97 as core 14 on socket 1 00:06:19.015 EAL: Detected lcore 98 as core 16 on socket 1 00:06:19.015 EAL: Detected lcore 99 as core 17 on socket 1 00:06:19.015 EAL: Detected lcore 100 as core 18 on socket 1 00:06:19.015 EAL: Detected lcore 101 as core 19 on socket 1 00:06:19.015 EAL: Detected lcore 102 as core 20 on socket 1 00:06:19.015 EAL: Detected lcore 103 as core 21 on socket 1 00:06:19.015 EAL: Detected lcore 104 as core 22 on socket 1 00:06:19.015 EAL: Detected lcore 105 as core 24 on socket 1 00:06:19.015 EAL: Detected lcore 106 as core 25 on socket 1 00:06:19.015 EAL: Detected lcore 107 as core 26 on socket 1 00:06:19.015 EAL: Detected lcore 108 as core 27 on socket 1 00:06:19.015 EAL: Detected lcore 109 as core 28 on socket 1 00:06:19.015 EAL: Detected lcore 110 as core 29 on socket 1 00:06:19.015 EAL: Detected lcore 111 as core 30 on socket 1 00:06:19.015 EAL: Maximum logical cores by configuration: 128 00:06:19.015 EAL: Detected CPU lcores: 112 00:06:19.015 EAL: Detected NUMA nodes: 2 00:06:19.015 EAL: Checking presence of .so 'librte_eal.so.23.0' 00:06:19.015 EAL: Checking presence of .so 'librte_eal.so.23' 00:06:19.015 EAL: Checking presence of .so 'librte_eal.so' 00:06:19.015 EAL: Detected static linkage of DPDK 00:06:19.015 EAL: No shared files mode enabled, IPC will be disabled 00:06:19.015 EAL: Bus pci wants IOVA as 'DC' 00:06:19.015 EAL: Buses did not request a specific IOVA mode. 00:06:19.015 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:19.015 EAL: Selected IOVA mode 'VA' 00:06:19.015 EAL: Probing VFIO support... 
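[editor's note] The EAL lines above are DPDK enumerating the host topology: 112 logical cores across 2 NUMA sockets. A host-side cross-check is sketched below with standard Linux tools (lscpu from util-linux; numactl only if it is installed), showing the same layout the EAL detects before the VFIO probe continues:

  lscpu | grep -E '^(CPU\(s\)|Socket|NUMA)'   # logical CPU count, sockets, NUMA node count
  numactl --hardware                          # per-node CPU lists and memory, matching 'Detected NUMA nodes: 2'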
00:06:19.015 EAL: IOMMU type 1 (Type 1) is supported 00:06:19.015 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:19.015 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:19.015 EAL: VFIO support initialized 00:06:19.015 EAL: Ask a virtual area of 0x2e000 bytes 00:06:19.015 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:19.015 EAL: Setting up physically contiguous memory... 00:06:19.015 EAL: Setting maximum number of open files to 524288 00:06:19.015 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:19.015 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:19.015 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:19.015 EAL: Ask a virtual area of 0x61000 bytes 00:06:19.015 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:19.015 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:19.015 EAL: Ask a virtual area of 0x400000000 bytes 00:06:19.015 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:19.015 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:19.015 EAL: Ask a virtual area of 0x61000 bytes 00:06:19.015 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:19.015 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:19.015 EAL: Ask a virtual area of 0x400000000 bytes 00:06:19.015 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:19.015 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:19.015 EAL: Ask a virtual area of 0x61000 bytes 00:06:19.015 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:19.015 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:19.015 EAL: Ask a virtual area of 0x400000000 bytes 00:06:19.015 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:19.015 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:19.015 EAL: Ask a virtual area of 0x61000 bytes 00:06:19.015 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:19.015 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:19.015 EAL: Ask a virtual area of 0x400000000 bytes 00:06:19.015 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:19.015 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:19.015 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:19.015 EAL: Ask a virtual area of 0x61000 bytes 00:06:19.015 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:19.015 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:19.015 EAL: Ask a virtual area of 0x400000000 bytes 00:06:19.015 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:19.015 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:19.015 EAL: Ask a virtual area of 0x61000 bytes 00:06:19.015 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:19.015 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:19.015 EAL: Ask a virtual area of 0x400000000 bytes 00:06:19.015 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:19.016 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:19.016 EAL: Ask a virtual area of 0x61000 bytes 00:06:19.016 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:19.016 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:19.016 EAL: Ask a virtual area of 0x400000000 bytes 00:06:19.016 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:06:19.016 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:19.016 EAL: Ask a virtual area of 0x61000 bytes 00:06:19.016 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:19.016 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:19.016 EAL: Ask a virtual area of 0x400000000 bytes 00:06:19.016 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:19.016 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:19.016 EAL: Hugepages will be freed exactly as allocated. 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: TSC frequency is ~2500000 KHz 00:06:19.016 EAL: Main lcore 0 is ready (tid=7fa7acde8a00;cpuset=[0]) 00:06:19.016 EAL: Trying to obtain current memory policy. 00:06:19.016 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.016 EAL: Restoring previous memory policy: 0 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was expanded by 2MB 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Mem event callback 'spdk:(nil)' registered 00:06:19.016 00:06:19.016 00:06:19.016 CUnit - A unit testing framework for C - Version 2.1-3 00:06:19.016 http://cunit.sourceforge.net/ 00:06:19.016 00:06:19.016 00:06:19.016 Suite: components_suite 00:06:19.016 Test: vtophys_malloc_test ...passed 00:06:19.016 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:19.016 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.016 EAL: Restoring previous memory policy: 4 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was expanded by 4MB 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was shrunk by 4MB 00:06:19.016 EAL: Trying to obtain current memory policy. 00:06:19.016 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.016 EAL: Restoring previous memory policy: 4 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was expanded by 6MB 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was shrunk by 6MB 00:06:19.016 EAL: Trying to obtain current memory policy. 00:06:19.016 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.016 EAL: Restoring previous memory policy: 4 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was expanded by 10MB 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was shrunk by 10MB 00:06:19.016 EAL: Trying to obtain current memory policy. 
00:06:19.016 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.016 EAL: Restoring previous memory policy: 4 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was expanded by 18MB 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was shrunk by 18MB 00:06:19.016 EAL: Trying to obtain current memory policy. 00:06:19.016 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.016 EAL: Restoring previous memory policy: 4 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was expanded by 34MB 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was shrunk by 34MB 00:06:19.016 EAL: Trying to obtain current memory policy. 00:06:19.016 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.016 EAL: Restoring previous memory policy: 4 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was expanded by 66MB 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was shrunk by 66MB 00:06:19.016 EAL: Trying to obtain current memory policy. 00:06:19.016 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.016 EAL: Restoring previous memory policy: 4 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was expanded by 130MB 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was shrunk by 130MB 00:06:19.016 EAL: Trying to obtain current memory policy. 00:06:19.016 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.016 EAL: Restoring previous memory policy: 4 00:06:19.016 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.016 EAL: request: mp_malloc_sync 00:06:19.016 EAL: No shared files mode enabled, IPC is disabled 00:06:19.016 EAL: Heap on socket 0 was expanded by 258MB 00:06:19.275 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.275 EAL: request: mp_malloc_sync 00:06:19.275 EAL: No shared files mode enabled, IPC is disabled 00:06:19.275 EAL: Heap on socket 0 was shrunk by 258MB 00:06:19.275 EAL: Trying to obtain current memory policy. 
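[editor's note] Each "Heap on socket 0 was expanded by N MB" / "shrunk by N MB" pair above is vtophys_spdk_malloc_test allocating and freeing progressively larger buffers; the DPDK heap grows and releases 2 MB hugepages through the registered 'spdk:(nil)' mem event callback. A sketch for inspecting the hugepage pools these expansions draw on, using standard procfs/sysfs paths (SPDK's setup.sh helper and its HUGEMEM variable, in MB, are the usual way to pre-reserve them):

  grep -i hugepages /proc/meminfo                                              # total and free 2048 kB pages
  cat /sys/devices/system/node/node*/hugepages/hugepages-2048kB/nr_hugepages   # per-NUMA-node pool sizes
  sudo HUGEMEM=4096 ./scripts/setup.sh                                         # reserve ~4 GB of hugepages from an SPDK checkout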
00:06:19.275 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.275 EAL: Restoring previous memory policy: 4 00:06:19.275 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.276 EAL: request: mp_malloc_sync 00:06:19.276 EAL: No shared files mode enabled, IPC is disabled 00:06:19.276 EAL: Heap on socket 0 was expanded by 514MB 00:06:19.276 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.535 EAL: request: mp_malloc_sync 00:06:19.535 EAL: No shared files mode enabled, IPC is disabled 00:06:19.535 EAL: Heap on socket 0 was shrunk by 514MB 00:06:19.535 EAL: Trying to obtain current memory policy. 00:06:19.535 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:19.535 EAL: Restoring previous memory policy: 4 00:06:19.535 EAL: Calling mem event callback 'spdk:(nil)' 00:06:19.535 EAL: request: mp_malloc_sync 00:06:19.535 EAL: No shared files mode enabled, IPC is disabled 00:06:19.535 EAL: Heap on socket 0 was expanded by 1026MB 00:06:19.794 EAL: Calling mem event callback 'spdk:(nil)' 00:06:20.054 EAL: request: mp_malloc_sync 00:06:20.054 EAL: No shared files mode enabled, IPC is disabled 00:06:20.054 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:20.054 passed 00:06:20.054 00:06:20.054 Run Summary: Type Total Ran Passed Failed Inactive 00:06:20.054 suites 1 1 n/a 0 0 00:06:20.054 tests 2 2 2 0 0 00:06:20.054 asserts 497 497 497 0 n/a 00:06:20.054 00:06:20.054 Elapsed time = 0.962 seconds 00:06:20.054 EAL: Calling mem event callback 'spdk:(nil)' 00:06:20.055 EAL: request: mp_malloc_sync 00:06:20.055 EAL: No shared files mode enabled, IPC is disabled 00:06:20.055 EAL: Heap on socket 0 was shrunk by 2MB 00:06:20.055 EAL: No shared files mode enabled, IPC is disabled 00:06:20.055 EAL: No shared files mode enabled, IPC is disabled 00:06:20.055 EAL: No shared files mode enabled, IPC is disabled 00:06:20.055 00:06:20.055 real 0m1.085s 00:06:20.055 user 0m0.626s 00:06:20.055 sys 0m0.429s 00:06:20.055 16:30:17 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:20.055 16:30:17 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:20.055 ************************************ 00:06:20.055 END TEST env_vtophys 00:06:20.055 ************************************ 00:06:20.055 16:30:17 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:20.055 16:30:17 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:20.055 16:30:17 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:20.055 16:30:17 env -- common/autotest_common.sh@10 -- # set +x 00:06:20.055 ************************************ 00:06:20.055 START TEST env_pci 00:06:20.055 ************************************ 00:06:20.055 16:30:17 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:20.055 00:06:20.055 00:06:20.055 CUnit - A unit testing framework for C - Version 2.1-3 00:06:20.055 http://cunit.sourceforge.net/ 00:06:20.055 00:06:20.055 00:06:20.055 Suite: pci 00:06:20.055 Test: pci_hook ...[2024-11-28 16:30:17.560734] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1050:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3739889 has claimed it 00:06:20.055 EAL: Cannot find device (10000:00:01.0) 00:06:20.055 EAL: Failed to attach device on primary process 00:06:20.055 passed 00:06:20.055 00:06:20.055 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:20.055 suites 1 1 n/a 0 0 00:06:20.055 tests 1 1 1 0 0 00:06:20.055 asserts 25 25 25 0 n/a 00:06:20.055 00:06:20.055 Elapsed time = 0.033 seconds 00:06:20.055 00:06:20.055 real 0m0.045s 00:06:20.055 user 0m0.011s 00:06:20.055 sys 0m0.034s 00:06:20.055 16:30:17 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:20.055 16:30:17 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:20.055 ************************************ 00:06:20.055 END TEST env_pci 00:06:20.055 ************************************ 00:06:20.055 16:30:17 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:20.055 16:30:17 env -- env/env.sh@15 -- # uname 00:06:20.055 16:30:17 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:20.055 16:30:17 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:20.055 16:30:17 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:20.055 16:30:17 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:06:20.055 16:30:17 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:20.055 16:30:17 env -- common/autotest_common.sh@10 -- # set +x 00:06:20.055 ************************************ 00:06:20.055 START TEST env_dpdk_post_init 00:06:20.055 ************************************ 00:06:20.055 16:30:17 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:20.055 EAL: Detected CPU lcores: 112 00:06:20.055 EAL: Detected NUMA nodes: 2 00:06:20.055 EAL: Detected static linkage of DPDK 00:06:20.055 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:20.315 EAL: Selected IOVA mode 'VA' 00:06:20.315 EAL: VFIO support initialized 00:06:20.315 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:20.315 EAL: Using IOMMU type 1 (Type 1) 00:06:20.883 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:06:25.077 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:06:25.077 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:06:25.077 Starting DPDK initialization... 00:06:25.077 Starting SPDK post initialization... 00:06:25.077 SPDK NVMe probe 00:06:25.077 Attaching to 0000:d8:00.0 00:06:25.077 Attached to 0000:d8:00.0 00:06:25.077 Cleaning up... 
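[editor's note] The env_dpdk_post_init run above probes the NVMe controller at 0000:d8:00.0 with the spdk_nvme driver and then releases it. Outside the CI harness, the device binding that makes this possible is normally handled by SPDK's scripts/setup.sh; a sketch, assuming an SPDK checkout and root:

  sudo ./scripts/setup.sh status   # list NVMe controllers and which driver (kernel nvme vs vfio-pci/uio) holds them
  sudo ./scripts/setup.sh          # unbind from the kernel and hand devices like 0000:d8:00.0 to vfio-pci
  sudo ./scripts/setup.sh reset    # return the devices to the kernel nvme driver afterwards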
00:06:25.077 00:06:25.077 real 0m4.731s 00:06:25.077 user 0m3.534s 00:06:25.077 sys 0m0.442s 00:06:25.077 16:30:22 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:25.077 16:30:22 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:25.077 ************************************ 00:06:25.077 END TEST env_dpdk_post_init 00:06:25.077 ************************************ 00:06:25.077 16:30:22 env -- env/env.sh@26 -- # uname 00:06:25.077 16:30:22 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:25.077 16:30:22 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:25.077 16:30:22 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:25.077 16:30:22 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:25.077 16:30:22 env -- common/autotest_common.sh@10 -- # set +x 00:06:25.077 ************************************ 00:06:25.077 START TEST env_mem_callbacks 00:06:25.077 ************************************ 00:06:25.077 16:30:22 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:25.077 EAL: Detected CPU lcores: 112 00:06:25.077 EAL: Detected NUMA nodes: 2 00:06:25.077 EAL: Detected static linkage of DPDK 00:06:25.077 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:25.077 EAL: Selected IOVA mode 'VA' 00:06:25.077 EAL: VFIO support initialized 00:06:25.077 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:25.077 00:06:25.077 00:06:25.077 CUnit - A unit testing framework for C - Version 2.1-3 00:06:25.077 http://cunit.sourceforge.net/ 00:06:25.077 00:06:25.077 00:06:25.077 Suite: memory 00:06:25.077 Test: test ... 
00:06:25.077 register 0x200000200000 2097152 00:06:25.077 malloc 3145728 00:06:25.077 register 0x200000400000 4194304 00:06:25.077 buf 0x200000500000 len 3145728 PASSED 00:06:25.077 malloc 64 00:06:25.077 buf 0x2000004fff40 len 64 PASSED 00:06:25.077 malloc 4194304 00:06:25.077 register 0x200000800000 6291456 00:06:25.077 buf 0x200000a00000 len 4194304 PASSED 00:06:25.077 free 0x200000500000 3145728 00:06:25.077 free 0x2000004fff40 64 00:06:25.077 unregister 0x200000400000 4194304 PASSED 00:06:25.077 free 0x200000a00000 4194304 00:06:25.077 unregister 0x200000800000 6291456 PASSED 00:06:25.077 malloc 8388608 00:06:25.077 register 0x200000400000 10485760 00:06:25.077 buf 0x200000600000 len 8388608 PASSED 00:06:25.077 free 0x200000600000 8388608 00:06:25.077 unregister 0x200000400000 10485760 PASSED 00:06:25.077 passed 00:06:25.077 00:06:25.077 Run Summary: Type Total Ran Passed Failed Inactive 00:06:25.077 suites 1 1 n/a 0 0 00:06:25.077 tests 1 1 1 0 0 00:06:25.077 asserts 15 15 15 0 n/a 00:06:25.077 00:06:25.077 Elapsed time = 0.005 seconds 00:06:25.077 00:06:25.077 real 0m0.059s 00:06:25.077 user 0m0.013s 00:06:25.077 sys 0m0.045s 00:06:25.077 16:30:22 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:25.077 16:30:22 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:25.077 ************************************ 00:06:25.077 END TEST env_mem_callbacks 00:06:25.077 ************************************ 00:06:25.077 00:06:25.077 real 0m6.605s 00:06:25.077 user 0m4.526s 00:06:25.077 sys 0m1.338s 00:06:25.077 16:30:22 env -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:25.077 16:30:22 env -- common/autotest_common.sh@10 -- # set +x 00:06:25.077 ************************************ 00:06:25.077 END TEST env 00:06:25.077 ************************************ 00:06:25.077 16:30:22 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:25.077 16:30:22 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:25.077 16:30:22 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:25.077 16:30:22 -- common/autotest_common.sh@10 -- # set +x 00:06:25.077 ************************************ 00:06:25.077 START TEST rpc 00:06:25.077 ************************************ 00:06:25.077 16:30:22 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:06:25.336 * Looking for test storage... 
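[editor's note] The register/unregister trace above is the mem_callbacks unit test exercising spdk_mem_register-style notifications as buffers are allocated and freed, and it closes out the env suite (6.6 s wall time). The suite's pieces can be run by hand from a built SPDK tree; a sketch using the binaries named in this log (root is typically needed for hugepage access):

  sudo ./test/env/memory/memory_ut              # mem map alloc/translation/registration tests
  sudo ./test/env/vtophys/vtophys               # virtual-to-physical translation tests
  sudo ./test/env/mem_callbacks/mem_callbacks   # the register/unregister trace shown above
  sudo ./test/env/env.sh                        # or the whole suite, the way autotest drives it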
00:06:25.336 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:25.336 16:30:22 rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:25.336 16:30:22 rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:25.336 16:30:22 rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:25.336 16:30:22 rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:25.336 16:30:22 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:25.336 16:30:22 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:25.336 16:30:22 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:25.336 16:30:22 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:25.336 16:30:22 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:25.336 16:30:22 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:25.336 16:30:22 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:25.336 16:30:22 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:25.336 16:30:22 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:25.336 16:30:22 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:25.336 16:30:22 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:25.336 16:30:22 rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:25.336 16:30:22 rpc -- scripts/common.sh@345 -- # : 1 00:06:25.336 16:30:22 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:25.336 16:30:22 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:25.336 16:30:22 rpc -- scripts/common.sh@365 -- # decimal 1 00:06:25.336 16:30:22 rpc -- scripts/common.sh@353 -- # local d=1 00:06:25.336 16:30:22 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:25.336 16:30:22 rpc -- scripts/common.sh@355 -- # echo 1 00:06:25.336 16:30:22 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:25.336 16:30:22 rpc -- scripts/common.sh@366 -- # decimal 2 00:06:25.336 16:30:22 rpc -- scripts/common.sh@353 -- # local d=2 00:06:25.336 16:30:22 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:25.336 16:30:22 rpc -- scripts/common.sh@355 -- # echo 2 00:06:25.336 16:30:22 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:25.336 16:30:22 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:25.336 16:30:22 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:25.336 16:30:22 rpc -- scripts/common.sh@368 -- # return 0 00:06:25.336 16:30:22 rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:25.336 16:30:22 rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:25.336 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:25.336 --rc genhtml_branch_coverage=1 00:06:25.336 --rc genhtml_function_coverage=1 00:06:25.336 --rc genhtml_legend=1 00:06:25.336 --rc geninfo_all_blocks=1 00:06:25.337 --rc geninfo_unexecuted_blocks=1 00:06:25.337 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:25.337 ' 00:06:25.337 16:30:22 rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:25.337 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:25.337 --rc genhtml_branch_coverage=1 00:06:25.337 --rc genhtml_function_coverage=1 00:06:25.337 --rc genhtml_legend=1 00:06:25.337 --rc geninfo_all_blocks=1 00:06:25.337 --rc geninfo_unexecuted_blocks=1 00:06:25.337 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:25.337 ' 00:06:25.337 16:30:22 rpc -- common/autotest_common.sh@1695 -- # 
export 'LCOV=lcov 00:06:25.337 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:25.337 --rc genhtml_branch_coverage=1 00:06:25.337 --rc genhtml_function_coverage=1 00:06:25.337 --rc genhtml_legend=1 00:06:25.337 --rc geninfo_all_blocks=1 00:06:25.337 --rc geninfo_unexecuted_blocks=1 00:06:25.337 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:25.337 ' 00:06:25.337 16:30:22 rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:25.337 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:25.337 --rc genhtml_branch_coverage=1 00:06:25.337 --rc genhtml_function_coverage=1 00:06:25.337 --rc genhtml_legend=1 00:06:25.337 --rc geninfo_all_blocks=1 00:06:25.337 --rc geninfo_unexecuted_blocks=1 00:06:25.337 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:25.337 ' 00:06:25.337 16:30:22 rpc -- rpc/rpc.sh@65 -- # spdk_pid=3741048 00:06:25.337 16:30:22 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:25.337 16:30:22 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:25.337 16:30:22 rpc -- rpc/rpc.sh@67 -- # waitforlisten 3741048 00:06:25.337 16:30:22 rpc -- common/autotest_common.sh@831 -- # '[' -z 3741048 ']' 00:06:25.337 16:30:22 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.337 16:30:22 rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:25.337 16:30:22 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.337 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:25.337 16:30:22 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:25.337 16:30:22 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:25.337 [2024-11-28 16:30:22.901481] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:25.337 [2024-11-28 16:30:22.901570] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3741048 ] 00:06:25.337 [2024-11-28 16:30:22.966900] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.602 [2024-11-28 16:30:23.005153] app.c: 610:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:25.602 [2024-11-28 16:30:23.005197] app.c: 614:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3741048' to capture a snapshot of events at runtime. 00:06:25.602 [2024-11-28 16:30:23.005207] app.c: 616:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:25.602 [2024-11-28 16:30:23.005215] app.c: 617:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:25.602 [2024-11-28 16:30:23.005222] app.c: 618:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3741048 for offline analysis/debug. 
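[editor's note] The rpc suite above starts spdk_tgt with the bdev tracepoint group enabled and waits on /var/tmp/spdk.sock (rpc.py's default socket); the rpc_integrity test that follows drives it over JSON-RPC. A sketch of the same flow by hand, using scripts/rpc.py and the method names visible in this log (the 8 MB / 512 B malloc bdev matches the 16384-block Malloc0 dumped below; jq is assumed to be installed):

  ./build/bin/spdk_tgt -e bdev &                  # same invocation as the test, minus the CI wrapper
  ./scripts/rpc.py bdev_malloc_create 8 512       # Malloc0: 8 MB with 512-byte blocks -> 16384 blocks
  ./scripts/rpc.py bdev_passthru_create -b Malloc0 -p Passthru0
  ./scripts/rpc.py bdev_get_bdevs | jq length     # rpc_integrity expects 2 bdevs at this point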
00:06:25.602 [2024-11-28 16:30:23.005251] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.603 16:30:23 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:25.603 16:30:23 rpc -- common/autotest_common.sh@864 -- # return 0 00:06:25.603 16:30:23 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:25.603 16:30:23 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:25.603 16:30:23 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:25.603 16:30:23 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:25.603 16:30:23 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:25.603 16:30:23 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:25.603 16:30:23 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:25.603 ************************************ 00:06:25.603 START TEST rpc_integrity 00:06:25.603 ************************************ 00:06:25.603 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:25.603 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:25.603 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:25.603 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:25.603 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:25.603 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:25.603 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:25.883 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:25.883 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:25.883 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:25.883 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:25.883 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:25.883 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:25.883 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:25.883 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:25.883 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:25.883 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:25.883 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:25.883 { 00:06:25.883 "name": "Malloc0", 00:06:25.883 "aliases": [ 00:06:25.883 "198d0505-0b19-42f8-9cf6-09063a151aad" 00:06:25.883 ], 00:06:25.884 "product_name": "Malloc disk", 00:06:25.884 "block_size": 512, 00:06:25.884 "num_blocks": 16384, 00:06:25.884 "uuid": "198d0505-0b19-42f8-9cf6-09063a151aad", 00:06:25.884 "assigned_rate_limits": { 00:06:25.884 "rw_ios_per_sec": 0, 00:06:25.884 "rw_mbytes_per_sec": 0, 00:06:25.884 "r_mbytes_per_sec": 0, 00:06:25.884 "w_mbytes_per_sec": 
0 00:06:25.884 }, 00:06:25.884 "claimed": false, 00:06:25.884 "zoned": false, 00:06:25.884 "supported_io_types": { 00:06:25.884 "read": true, 00:06:25.884 "write": true, 00:06:25.884 "unmap": true, 00:06:25.884 "flush": true, 00:06:25.884 "reset": true, 00:06:25.884 "nvme_admin": false, 00:06:25.884 "nvme_io": false, 00:06:25.884 "nvme_io_md": false, 00:06:25.884 "write_zeroes": true, 00:06:25.884 "zcopy": true, 00:06:25.884 "get_zone_info": false, 00:06:25.884 "zone_management": false, 00:06:25.884 "zone_append": false, 00:06:25.884 "compare": false, 00:06:25.884 "compare_and_write": false, 00:06:25.884 "abort": true, 00:06:25.884 "seek_hole": false, 00:06:25.884 "seek_data": false, 00:06:25.884 "copy": true, 00:06:25.884 "nvme_iov_md": false 00:06:25.884 }, 00:06:25.884 "memory_domains": [ 00:06:25.884 { 00:06:25.884 "dma_device_id": "system", 00:06:25.884 "dma_device_type": 1 00:06:25.884 }, 00:06:25.884 { 00:06:25.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:25.884 "dma_device_type": 2 00:06:25.884 } 00:06:25.884 ], 00:06:25.884 "driver_specific": {} 00:06:25.884 } 00:06:25.884 ]' 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:25.884 [2024-11-28 16:30:23.358958] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:25.884 [2024-11-28 16:30:23.358990] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:25.884 [2024-11-28 16:30:23.359007] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4536bc0 00:06:25.884 [2024-11-28 16:30:23.359016] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:25.884 [2024-11-28 16:30:23.359821] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:25.884 [2024-11-28 16:30:23.359843] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:25.884 Passthru0 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:25.884 { 00:06:25.884 "name": "Malloc0", 00:06:25.884 "aliases": [ 00:06:25.884 "198d0505-0b19-42f8-9cf6-09063a151aad" 00:06:25.884 ], 00:06:25.884 "product_name": "Malloc disk", 00:06:25.884 "block_size": 512, 00:06:25.884 "num_blocks": 16384, 00:06:25.884 "uuid": "198d0505-0b19-42f8-9cf6-09063a151aad", 00:06:25.884 "assigned_rate_limits": { 00:06:25.884 "rw_ios_per_sec": 0, 00:06:25.884 "rw_mbytes_per_sec": 0, 00:06:25.884 "r_mbytes_per_sec": 0, 00:06:25.884 "w_mbytes_per_sec": 0 00:06:25.884 }, 00:06:25.884 "claimed": true, 00:06:25.884 "claim_type": "exclusive_write", 00:06:25.884 "zoned": false, 00:06:25.884 "supported_io_types": { 00:06:25.884 "read": true, 00:06:25.884 "write": true, 00:06:25.884 "unmap": true, 
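Note: the rpc_integrity phase above exercises the full malloc/passthru bdev lifecycle through the rpc_cmd wrapper. The same sequence can be driven by hand with SPDK's scripts/rpc.py against a running spdk_tgt; a minimal sketch, assuming an SPDK checkout at ./spdk and the default RPC socket (paths here are illustrative, not taken from this run):

    # create an 8 MB malloc bdev with 512-byte blocks (16384 blocks, matching the dump above);
    # the first malloc bdev is typically auto-named Malloc0
    ./spdk/scripts/rpc.py bdev_malloc_create 8 512
    # layer a passthru bdev on top; this claims the base bdev
    ./spdk/scripts/rpc.py bdev_passthru_create -b Malloc0 -p Passthru0
    # both bdevs should now be listed
    ./spdk/scripts/rpc.py bdev_get_bdevs | jq length    # expect 2
    # tear down in reverse order
    ./spdk/scripts/rpc.py bdev_passthru_delete Passthru0
    ./spdk/scripts/rpc.py bdev_malloc_delete Malloc0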
00:06:25.884 "flush": true, 00:06:25.884 "reset": true, 00:06:25.884 "nvme_admin": false, 00:06:25.884 "nvme_io": false, 00:06:25.884 "nvme_io_md": false, 00:06:25.884 "write_zeroes": true, 00:06:25.884 "zcopy": true, 00:06:25.884 "get_zone_info": false, 00:06:25.884 "zone_management": false, 00:06:25.884 "zone_append": false, 00:06:25.884 "compare": false, 00:06:25.884 "compare_and_write": false, 00:06:25.884 "abort": true, 00:06:25.884 "seek_hole": false, 00:06:25.884 "seek_data": false, 00:06:25.884 "copy": true, 00:06:25.884 "nvme_iov_md": false 00:06:25.884 }, 00:06:25.884 "memory_domains": [ 00:06:25.884 { 00:06:25.884 "dma_device_id": "system", 00:06:25.884 "dma_device_type": 1 00:06:25.884 }, 00:06:25.884 { 00:06:25.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:25.884 "dma_device_type": 2 00:06:25.884 } 00:06:25.884 ], 00:06:25.884 "driver_specific": {} 00:06:25.884 }, 00:06:25.884 { 00:06:25.884 "name": "Passthru0", 00:06:25.884 "aliases": [ 00:06:25.884 "c2c62b33-ba04-5792-8ae1-089fad17feb2" 00:06:25.884 ], 00:06:25.884 "product_name": "passthru", 00:06:25.884 "block_size": 512, 00:06:25.884 "num_blocks": 16384, 00:06:25.884 "uuid": "c2c62b33-ba04-5792-8ae1-089fad17feb2", 00:06:25.884 "assigned_rate_limits": { 00:06:25.884 "rw_ios_per_sec": 0, 00:06:25.884 "rw_mbytes_per_sec": 0, 00:06:25.884 "r_mbytes_per_sec": 0, 00:06:25.884 "w_mbytes_per_sec": 0 00:06:25.884 }, 00:06:25.884 "claimed": false, 00:06:25.884 "zoned": false, 00:06:25.884 "supported_io_types": { 00:06:25.884 "read": true, 00:06:25.884 "write": true, 00:06:25.884 "unmap": true, 00:06:25.884 "flush": true, 00:06:25.884 "reset": true, 00:06:25.884 "nvme_admin": false, 00:06:25.884 "nvme_io": false, 00:06:25.884 "nvme_io_md": false, 00:06:25.884 "write_zeroes": true, 00:06:25.884 "zcopy": true, 00:06:25.884 "get_zone_info": false, 00:06:25.884 "zone_management": false, 00:06:25.884 "zone_append": false, 00:06:25.884 "compare": false, 00:06:25.884 "compare_and_write": false, 00:06:25.884 "abort": true, 00:06:25.884 "seek_hole": false, 00:06:25.884 "seek_data": false, 00:06:25.884 "copy": true, 00:06:25.884 "nvme_iov_md": false 00:06:25.884 }, 00:06:25.884 "memory_domains": [ 00:06:25.884 { 00:06:25.884 "dma_device_id": "system", 00:06:25.884 "dma_device_type": 1 00:06:25.884 }, 00:06:25.884 { 00:06:25.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:25.884 "dma_device_type": 2 00:06:25.884 } 00:06:25.884 ], 00:06:25.884 "driver_specific": { 00:06:25.884 "passthru": { 00:06:25.884 "name": "Passthru0", 00:06:25.884 "base_bdev_name": "Malloc0" 00:06:25.884 } 00:06:25.884 } 00:06:25.884 } 00:06:25.884 ]' 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:25.884 16:30:23 rpc.rpc_integrity -- 
rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:25.884 16:30:23 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:25.884 00:06:25.884 real 0m0.267s 00:06:25.884 user 0m0.168s 00:06:25.884 sys 0m0.046s 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:25.884 16:30:23 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:25.884 ************************************ 00:06:25.884 END TEST rpc_integrity 00:06:25.884 ************************************ 00:06:26.224 16:30:23 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:26.224 16:30:23 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:26.224 16:30:23 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:26.224 16:30:23 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:26.224 ************************************ 00:06:26.224 START TEST rpc_plugins 00:06:26.224 ************************************ 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:26.224 { 00:06:26.224 "name": "Malloc1", 00:06:26.224 "aliases": [ 00:06:26.224 "11fced88-8446-47ab-a8c9-22eab117abc8" 00:06:26.224 ], 00:06:26.224 "product_name": "Malloc disk", 00:06:26.224 "block_size": 4096, 00:06:26.224 "num_blocks": 256, 00:06:26.224 "uuid": "11fced88-8446-47ab-a8c9-22eab117abc8", 00:06:26.224 "assigned_rate_limits": { 00:06:26.224 "rw_ios_per_sec": 0, 00:06:26.224 "rw_mbytes_per_sec": 0, 00:06:26.224 "r_mbytes_per_sec": 0, 00:06:26.224 "w_mbytes_per_sec": 0 00:06:26.224 }, 00:06:26.224 "claimed": false, 00:06:26.224 "zoned": false, 00:06:26.224 "supported_io_types": { 00:06:26.224 "read": true, 00:06:26.224 "write": true, 00:06:26.224 "unmap": true, 00:06:26.224 "flush": true, 00:06:26.224 "reset": true, 00:06:26.224 "nvme_admin": false, 00:06:26.224 "nvme_io": false, 00:06:26.224 "nvme_io_md": false, 00:06:26.224 "write_zeroes": true, 00:06:26.224 "zcopy": true, 00:06:26.224 "get_zone_info": false, 00:06:26.224 "zone_management": false, 00:06:26.224 "zone_append": false, 00:06:26.224 "compare": false, 00:06:26.224 "compare_and_write": false, 00:06:26.224 "abort": true, 00:06:26.224 "seek_hole": false, 00:06:26.224 "seek_data": false, 00:06:26.224 "copy": true, 00:06:26.224 
"nvme_iov_md": false 00:06:26.224 }, 00:06:26.224 "memory_domains": [ 00:06:26.224 { 00:06:26.224 "dma_device_id": "system", 00:06:26.224 "dma_device_type": 1 00:06:26.224 }, 00:06:26.224 { 00:06:26.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:26.224 "dma_device_type": 2 00:06:26.224 } 00:06:26.224 ], 00:06:26.224 "driver_specific": {} 00:06:26.224 } 00:06:26.224 ]' 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:26.224 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:26.224 16:30:23 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:26.224 00:06:26.224 real 0m0.144s 00:06:26.225 user 0m0.092s 00:06:26.225 sys 0m0.020s 00:06:26.225 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:26.225 16:30:23 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:26.225 ************************************ 00:06:26.225 END TEST rpc_plugins 00:06:26.225 ************************************ 00:06:26.225 16:30:23 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:26.225 16:30:23 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:26.225 16:30:23 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:26.225 16:30:23 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:26.225 ************************************ 00:06:26.225 START TEST rpc_trace_cmd_test 00:06:26.225 ************************************ 00:06:26.225 16:30:23 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:06:26.225 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:26.225 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:26.225 16:30:23 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.225 16:30:23 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:26.225 16:30:23 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.225 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:26.225 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3741048", 00:06:26.225 "tpoint_group_mask": "0x8", 00:06:26.225 "iscsi_conn": { 00:06:26.225 "mask": "0x2", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "scsi": { 00:06:26.225 "mask": "0x4", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "bdev": { 00:06:26.225 "mask": "0x8", 00:06:26.225 "tpoint_mask": "0xffffffffffffffff" 00:06:26.225 }, 00:06:26.225 "nvmf_rdma": { 00:06:26.225 "mask": "0x10", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "nvmf_tcp": { 00:06:26.225 "mask": "0x20", 
00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "ftl": { 00:06:26.225 "mask": "0x40", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "blobfs": { 00:06:26.225 "mask": "0x80", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "dsa": { 00:06:26.225 "mask": "0x200", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "thread": { 00:06:26.225 "mask": "0x400", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "nvme_pcie": { 00:06:26.225 "mask": "0x800", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "iaa": { 00:06:26.225 "mask": "0x1000", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "nvme_tcp": { 00:06:26.225 "mask": "0x2000", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "bdev_nvme": { 00:06:26.225 "mask": "0x4000", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "sock": { 00:06:26.225 "mask": "0x8000", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "blob": { 00:06:26.225 "mask": "0x10000", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 }, 00:06:26.225 "bdev_raid": { 00:06:26.225 "mask": "0x20000", 00:06:26.225 "tpoint_mask": "0x0" 00:06:26.225 } 00:06:26.225 }' 00:06:26.225 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:26.505 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 18 -gt 2 ']' 00:06:26.505 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:26.505 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:26.505 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:26.506 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:26.506 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:26.506 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:26.506 16:30:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:26.506 16:30:24 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:26.506 00:06:26.506 real 0m0.216s 00:06:26.506 user 0m0.174s 00:06:26.506 sys 0m0.032s 00:06:26.506 16:30:24 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:26.506 16:30:24 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:26.506 ************************************ 00:06:26.506 END TEST rpc_trace_cmd_test 00:06:26.506 ************************************ 00:06:26.506 16:30:24 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:26.506 16:30:24 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:26.506 16:30:24 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:26.506 16:30:24 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:26.506 16:30:24 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:26.506 16:30:24 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:26.506 ************************************ 00:06:26.506 START TEST rpc_daemon_integrity 00:06:26.506 ************************************ 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:26.506 16:30:24 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.506 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:26.765 { 00:06:26.765 "name": "Malloc2", 00:06:26.765 "aliases": [ 00:06:26.765 "eea2abe3-897b-4cb3-bb32-30d1a076f50c" 00:06:26.765 ], 00:06:26.765 "product_name": "Malloc disk", 00:06:26.765 "block_size": 512, 00:06:26.765 "num_blocks": 16384, 00:06:26.765 "uuid": "eea2abe3-897b-4cb3-bb32-30d1a076f50c", 00:06:26.765 "assigned_rate_limits": { 00:06:26.765 "rw_ios_per_sec": 0, 00:06:26.765 "rw_mbytes_per_sec": 0, 00:06:26.765 "r_mbytes_per_sec": 0, 00:06:26.765 "w_mbytes_per_sec": 0 00:06:26.765 }, 00:06:26.765 "claimed": false, 00:06:26.765 "zoned": false, 00:06:26.765 "supported_io_types": { 00:06:26.765 "read": true, 00:06:26.765 "write": true, 00:06:26.765 "unmap": true, 00:06:26.765 "flush": true, 00:06:26.765 "reset": true, 00:06:26.765 "nvme_admin": false, 00:06:26.765 "nvme_io": false, 00:06:26.765 "nvme_io_md": false, 00:06:26.765 "write_zeroes": true, 00:06:26.765 "zcopy": true, 00:06:26.765 "get_zone_info": false, 00:06:26.765 "zone_management": false, 00:06:26.765 "zone_append": false, 00:06:26.765 "compare": false, 00:06:26.765 "compare_and_write": false, 00:06:26.765 "abort": true, 00:06:26.765 "seek_hole": false, 00:06:26.765 "seek_data": false, 00:06:26.765 "copy": true, 00:06:26.765 "nvme_iov_md": false 00:06:26.765 }, 00:06:26.765 "memory_domains": [ 00:06:26.765 { 00:06:26.765 "dma_device_id": "system", 00:06:26.765 "dma_device_type": 1 00:06:26.765 }, 00:06:26.765 { 00:06:26.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:26.765 "dma_device_type": 2 00:06:26.765 } 00:06:26.765 ], 00:06:26.765 "driver_specific": {} 00:06:26.765 } 00:06:26.765 ]' 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:26.765 [2024-11-28 16:30:24.229328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:26.765 [2024-11-28 16:30:24.229362] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:06:26.765 [2024-11-28 16:30:24.229378] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x453b1d0 00:06:26.765 [2024-11-28 16:30:24.229387] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:26.765 [2024-11-28 16:30:24.230115] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:26.765 [2024-11-28 16:30:24.230140] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:26.765 Passthru0 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.765 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:26.765 { 00:06:26.765 "name": "Malloc2", 00:06:26.765 "aliases": [ 00:06:26.765 "eea2abe3-897b-4cb3-bb32-30d1a076f50c" 00:06:26.765 ], 00:06:26.766 "product_name": "Malloc disk", 00:06:26.766 "block_size": 512, 00:06:26.766 "num_blocks": 16384, 00:06:26.766 "uuid": "eea2abe3-897b-4cb3-bb32-30d1a076f50c", 00:06:26.766 "assigned_rate_limits": { 00:06:26.766 "rw_ios_per_sec": 0, 00:06:26.766 "rw_mbytes_per_sec": 0, 00:06:26.766 "r_mbytes_per_sec": 0, 00:06:26.766 "w_mbytes_per_sec": 0 00:06:26.766 }, 00:06:26.766 "claimed": true, 00:06:26.766 "claim_type": "exclusive_write", 00:06:26.766 "zoned": false, 00:06:26.766 "supported_io_types": { 00:06:26.766 "read": true, 00:06:26.766 "write": true, 00:06:26.766 "unmap": true, 00:06:26.766 "flush": true, 00:06:26.766 "reset": true, 00:06:26.766 "nvme_admin": false, 00:06:26.766 "nvme_io": false, 00:06:26.766 "nvme_io_md": false, 00:06:26.766 "write_zeroes": true, 00:06:26.766 "zcopy": true, 00:06:26.766 "get_zone_info": false, 00:06:26.766 "zone_management": false, 00:06:26.766 "zone_append": false, 00:06:26.766 "compare": false, 00:06:26.766 "compare_and_write": false, 00:06:26.766 "abort": true, 00:06:26.766 "seek_hole": false, 00:06:26.766 "seek_data": false, 00:06:26.766 "copy": true, 00:06:26.766 "nvme_iov_md": false 00:06:26.766 }, 00:06:26.766 "memory_domains": [ 00:06:26.766 { 00:06:26.766 "dma_device_id": "system", 00:06:26.766 "dma_device_type": 1 00:06:26.766 }, 00:06:26.766 { 00:06:26.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:26.766 "dma_device_type": 2 00:06:26.766 } 00:06:26.766 ], 00:06:26.766 "driver_specific": {} 00:06:26.766 }, 00:06:26.766 { 00:06:26.766 "name": "Passthru0", 00:06:26.766 "aliases": [ 00:06:26.766 "64f5369f-c65c-5faf-9104-9ff0e119ac4a" 00:06:26.766 ], 00:06:26.766 "product_name": "passthru", 00:06:26.766 "block_size": 512, 00:06:26.766 "num_blocks": 16384, 00:06:26.766 "uuid": "64f5369f-c65c-5faf-9104-9ff0e119ac4a", 00:06:26.766 "assigned_rate_limits": { 00:06:26.766 "rw_ios_per_sec": 0, 00:06:26.766 "rw_mbytes_per_sec": 0, 00:06:26.766 "r_mbytes_per_sec": 0, 00:06:26.766 "w_mbytes_per_sec": 0 00:06:26.766 }, 00:06:26.766 "claimed": false, 00:06:26.766 "zoned": false, 00:06:26.766 "supported_io_types": { 00:06:26.766 "read": true, 00:06:26.766 "write": true, 00:06:26.766 "unmap": true, 00:06:26.766 "flush": true, 00:06:26.766 "reset": true, 00:06:26.766 "nvme_admin": false, 00:06:26.766 "nvme_io": false, 00:06:26.766 "nvme_io_md": false, 
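Note: the jq checks interleaved with these dumps are asserting claim state. After bdev_passthru_create, the base bdev flips to claimed with claim_type exclusive_write, while the passthru bdev itself stays unclaimed; a sketch of the same checks against named bdevs (names as in this run):

    ./spdk/scripts/rpc.py bdev_get_bdevs -b Malloc2 | jq -r '.[0].claimed'     # true
    ./spdk/scripts/rpc.py bdev_get_bdevs -b Malloc2 | jq -r '.[0].claim_type'  # exclusive_write
    ./spdk/scripts/rpc.py bdev_get_bdevs -b Passthru0 | jq -r '.[0].claimed'   # false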
00:06:26.766 "write_zeroes": true, 00:06:26.766 "zcopy": true, 00:06:26.766 "get_zone_info": false, 00:06:26.766 "zone_management": false, 00:06:26.766 "zone_append": false, 00:06:26.766 "compare": false, 00:06:26.766 "compare_and_write": false, 00:06:26.766 "abort": true, 00:06:26.766 "seek_hole": false, 00:06:26.766 "seek_data": false, 00:06:26.766 "copy": true, 00:06:26.766 "nvme_iov_md": false 00:06:26.766 }, 00:06:26.766 "memory_domains": [ 00:06:26.766 { 00:06:26.766 "dma_device_id": "system", 00:06:26.766 "dma_device_type": 1 00:06:26.766 }, 00:06:26.766 { 00:06:26.766 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:26.766 "dma_device_type": 2 00:06:26.766 } 00:06:26.766 ], 00:06:26.766 "driver_specific": { 00:06:26.766 "passthru": { 00:06:26.766 "name": "Passthru0", 00:06:26.766 "base_bdev_name": "Malloc2" 00:06:26.766 } 00:06:26.766 } 00:06:26.766 } 00:06:26.766 ]' 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:26.766 00:06:26.766 real 0m0.281s 00:06:26.766 user 0m0.176s 00:06:26.766 sys 0m0.049s 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:26.766 16:30:24 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:26.766 ************************************ 00:06:26.766 END TEST rpc_daemon_integrity 00:06:26.766 ************************************ 00:06:27.026 16:30:24 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:27.026 16:30:24 rpc -- rpc/rpc.sh@84 -- # killprocess 3741048 00:06:27.026 16:30:24 rpc -- common/autotest_common.sh@950 -- # '[' -z 3741048 ']' 00:06:27.026 16:30:24 rpc -- common/autotest_common.sh@954 -- # kill -0 3741048 00:06:27.026 16:30:24 rpc -- common/autotest_common.sh@955 -- # uname 00:06:27.026 16:30:24 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:27.026 16:30:24 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3741048 00:06:27.026 16:30:24 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:27.026 
16:30:24 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:27.026 16:30:24 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3741048' 00:06:27.026 killing process with pid 3741048 00:06:27.026 16:30:24 rpc -- common/autotest_common.sh@969 -- # kill 3741048 00:06:27.026 16:30:24 rpc -- common/autotest_common.sh@974 -- # wait 3741048 00:06:27.285 00:06:27.285 real 0m2.110s 00:06:27.285 user 0m2.645s 00:06:27.285 sys 0m0.813s 00:06:27.285 16:30:24 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:27.285 16:30:24 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.285 ************************************ 00:06:27.285 END TEST rpc 00:06:27.285 ************************************ 00:06:27.285 16:30:24 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:27.285 16:30:24 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:27.285 16:30:24 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:27.285 16:30:24 -- common/autotest_common.sh@10 -- # set +x 00:06:27.285 ************************************ 00:06:27.285 START TEST skip_rpc 00:06:27.285 ************************************ 00:06:27.285 16:30:24 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:27.545 * Looking for test storage... 00:06:27.545 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:06:27.545 16:30:24 skip_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:27.545 16:30:24 skip_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:27.545 16:30:24 skip_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:27.545 16:30:25 skip_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@345 -- # : 1 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:27.545 16:30:25 skip_rpc -- scripts/common.sh@368 -- # return 0 00:06:27.545 16:30:25 skip_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:27.545 16:30:25 skip_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:27.545 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:27.545 --rc genhtml_branch_coverage=1 00:06:27.545 --rc genhtml_function_coverage=1 00:06:27.545 --rc genhtml_legend=1 00:06:27.545 --rc geninfo_all_blocks=1 00:06:27.545 --rc geninfo_unexecuted_blocks=1 00:06:27.545 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:27.545 ' 00:06:27.545 16:30:25 skip_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:27.545 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:27.545 --rc genhtml_branch_coverage=1 00:06:27.545 --rc genhtml_function_coverage=1 00:06:27.545 --rc genhtml_legend=1 00:06:27.545 --rc geninfo_all_blocks=1 00:06:27.545 --rc geninfo_unexecuted_blocks=1 00:06:27.545 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:27.545 ' 00:06:27.545 16:30:25 skip_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:27.545 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:27.545 --rc genhtml_branch_coverage=1 00:06:27.545 --rc genhtml_function_coverage=1 00:06:27.545 --rc genhtml_legend=1 00:06:27.545 --rc geninfo_all_blocks=1 00:06:27.545 --rc geninfo_unexecuted_blocks=1 00:06:27.545 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:27.545 ' 00:06:27.546 16:30:25 skip_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:27.546 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:27.546 --rc genhtml_branch_coverage=1 00:06:27.546 --rc genhtml_function_coverage=1 00:06:27.546 --rc genhtml_legend=1 00:06:27.546 --rc geninfo_all_blocks=1 00:06:27.546 --rc geninfo_unexecuted_blocks=1 00:06:27.546 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:27.546 ' 00:06:27.546 16:30:25 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:27.546 16:30:25 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:27.546 16:30:25 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:27.546 16:30:25 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:27.546 16:30:25 
skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:27.546 16:30:25 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:27.546 ************************************ 00:06:27.546 START TEST skip_rpc 00:06:27.546 ************************************ 00:06:27.546 16:30:25 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:06:27.546 16:30:25 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=3741513 00:06:27.546 16:30:25 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:27.546 16:30:25 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:27.546 16:30:25 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:27.546 [2024-11-28 16:30:25.115236] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:27.546 [2024-11-28 16:30:25.115314] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3741513 ] 00:06:27.546 [2024-11-28 16:30:25.181457] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.804 [2024-11-28 16:30:25.220412] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 3741513 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 3741513 ']' 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 3741513 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3741513 
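Note: skip_rpc boots the target with --no-rpc-server, so the NOT rpc_cmd spdk_get_version block above passes only if the RPC fails. A standalone sketch of the same expectation (socket path and sleep are assumptions):

    ./spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 &
    sleep 5
    # no listener was created on /var/tmp/spdk.sock, so any RPC must fail
    if ./spdk/scripts/rpc.py spdk_get_version; then
        echo "unexpected: RPC server answered" >&2
    fi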
00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3741513' 00:06:33.080 killing process with pid 3741513 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 3741513 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 3741513 00:06:33.080 00:06:33.080 real 0m5.396s 00:06:33.080 user 0m5.152s 00:06:33.080 sys 0m0.302s 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:33.080 16:30:30 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:33.080 ************************************ 00:06:33.080 END TEST skip_rpc 00:06:33.080 ************************************ 00:06:33.080 16:30:30 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:33.080 16:30:30 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:33.080 16:30:30 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:33.080 16:30:30 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:33.080 ************************************ 00:06:33.080 START TEST skip_rpc_with_json 00:06:33.080 ************************************ 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=3742594 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 3742594 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 3742594 ']' 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:33.080 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:33.080 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:33.080 [2024-11-28 16:30:30.599548] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
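Note: skip_rpc_with_json, starting above, first probes nvmf_get_transports (expecting the 'does not exist' error shown below), then creates the TCP transport and snapshots the configuration with save_config. The equivalent manual flow, sketched with assumed default paths:

    # before any transport exists, the query fails with "transport 'tcp' does not exist"
    ./spdk/scripts/rpc.py nvmf_get_transports --trtype tcp || true
    # create the TCP transport, after which the same query succeeds
    ./spdk/scripts/rpc.py nvmf_create_transport -t tcp
    ./spdk/scripts/rpc.py nvmf_get_transports --trtype tcp
    # dump the runtime configuration; it can be replayed with 'spdk_tgt --json <file>'
    ./spdk/scripts/rpc.py save_config > config.json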
00:06:33.080 [2024-11-28 16:30:30.599650] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3742594 ] 00:06:33.080 [2024-11-28 16:30:30.667845] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.080 [2024-11-28 16:30:30.706190] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:33.339 [2024-11-28 16:30:30.905238] nvmf_rpc.c:2703:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:33.339 request: 00:06:33.339 { 00:06:33.339 "trtype": "tcp", 00:06:33.339 "method": "nvmf_get_transports", 00:06:33.339 "req_id": 1 00:06:33.339 } 00:06:33.339 Got JSON-RPC error response 00:06:33.339 response: 00:06:33.339 { 00:06:33.339 "code": -19, 00:06:33.339 "message": "No such device" 00:06:33.339 } 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:33.339 [2024-11-28 16:30:30.917327] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:33.339 16:30:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:33.599 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:33.599 16:30:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:33.599 { 00:06:33.599 "subsystems": [ 00:06:33.599 { 00:06:33.599 "subsystem": "scheduler", 00:06:33.599 "config": [ 00:06:33.599 { 00:06:33.599 "method": "framework_set_scheduler", 00:06:33.599 "params": { 00:06:33.599 "name": "static" 00:06:33.599 } 00:06:33.599 } 00:06:33.599 ] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "vmd", 00:06:33.599 "config": [] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "sock", 00:06:33.599 "config": [ 00:06:33.599 { 00:06:33.599 "method": "sock_set_default_impl", 00:06:33.599 "params": { 00:06:33.599 "impl_name": "posix" 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "sock_impl_set_options", 00:06:33.599 "params": { 00:06:33.599 "impl_name": "ssl", 00:06:33.599 "recv_buf_size": 4096, 00:06:33.599 "send_buf_size": 4096, 00:06:33.599 "enable_recv_pipe": true, 00:06:33.599 "enable_quickack": false, 00:06:33.599 
"enable_placement_id": 0, 00:06:33.599 "enable_zerocopy_send_server": true, 00:06:33.599 "enable_zerocopy_send_client": false, 00:06:33.599 "zerocopy_threshold": 0, 00:06:33.599 "tls_version": 0, 00:06:33.599 "enable_ktls": false 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "sock_impl_set_options", 00:06:33.599 "params": { 00:06:33.599 "impl_name": "posix", 00:06:33.599 "recv_buf_size": 2097152, 00:06:33.599 "send_buf_size": 2097152, 00:06:33.599 "enable_recv_pipe": true, 00:06:33.599 "enable_quickack": false, 00:06:33.599 "enable_placement_id": 0, 00:06:33.599 "enable_zerocopy_send_server": true, 00:06:33.599 "enable_zerocopy_send_client": false, 00:06:33.599 "zerocopy_threshold": 0, 00:06:33.599 "tls_version": 0, 00:06:33.599 "enable_ktls": false 00:06:33.599 } 00:06:33.599 } 00:06:33.599 ] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "iobuf", 00:06:33.599 "config": [ 00:06:33.599 { 00:06:33.599 "method": "iobuf_set_options", 00:06:33.599 "params": { 00:06:33.599 "small_pool_count": 8192, 00:06:33.599 "large_pool_count": 1024, 00:06:33.599 "small_bufsize": 8192, 00:06:33.599 "large_bufsize": 135168 00:06:33.599 } 00:06:33.599 } 00:06:33.599 ] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "keyring", 00:06:33.599 "config": [] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "vfio_user_target", 00:06:33.599 "config": null 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "fsdev", 00:06:33.599 "config": [ 00:06:33.599 { 00:06:33.599 "method": "fsdev_set_opts", 00:06:33.599 "params": { 00:06:33.599 "fsdev_io_pool_size": 65535, 00:06:33.599 "fsdev_io_cache_size": 256 00:06:33.599 } 00:06:33.599 } 00:06:33.599 ] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "accel", 00:06:33.599 "config": [ 00:06:33.599 { 00:06:33.599 "method": "accel_set_options", 00:06:33.599 "params": { 00:06:33.599 "small_cache_size": 128, 00:06:33.599 "large_cache_size": 16, 00:06:33.599 "task_count": 2048, 00:06:33.599 "sequence_count": 2048, 00:06:33.599 "buf_count": 2048 00:06:33.599 } 00:06:33.599 } 00:06:33.599 ] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "bdev", 00:06:33.599 "config": [ 00:06:33.599 { 00:06:33.599 "method": "bdev_set_options", 00:06:33.599 "params": { 00:06:33.599 "bdev_io_pool_size": 65535, 00:06:33.599 "bdev_io_cache_size": 256, 00:06:33.599 "bdev_auto_examine": true, 00:06:33.599 "iobuf_small_cache_size": 128, 00:06:33.599 "iobuf_large_cache_size": 16 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "bdev_raid_set_options", 00:06:33.599 "params": { 00:06:33.599 "process_window_size_kb": 1024, 00:06:33.599 "process_max_bandwidth_mb_sec": 0 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "bdev_nvme_set_options", 00:06:33.599 "params": { 00:06:33.599 "action_on_timeout": "none", 00:06:33.599 "timeout_us": 0, 00:06:33.599 "timeout_admin_us": 0, 00:06:33.599 "keep_alive_timeout_ms": 10000, 00:06:33.599 "arbitration_burst": 0, 00:06:33.599 "low_priority_weight": 0, 00:06:33.599 "medium_priority_weight": 0, 00:06:33.599 "high_priority_weight": 0, 00:06:33.599 "nvme_adminq_poll_period_us": 10000, 00:06:33.599 "nvme_ioq_poll_period_us": 0, 00:06:33.599 "io_queue_requests": 0, 00:06:33.599 "delay_cmd_submit": true, 00:06:33.599 "transport_retry_count": 4, 00:06:33.599 "bdev_retry_count": 3, 00:06:33.599 "transport_ack_timeout": 0, 00:06:33.599 "ctrlr_loss_timeout_sec": 0, 00:06:33.599 "reconnect_delay_sec": 0, 00:06:33.599 "fast_io_fail_timeout_sec": 0, 00:06:33.599 
"disable_auto_failback": false, 00:06:33.599 "generate_uuids": false, 00:06:33.599 "transport_tos": 0, 00:06:33.599 "nvme_error_stat": false, 00:06:33.599 "rdma_srq_size": 0, 00:06:33.599 "io_path_stat": false, 00:06:33.599 "allow_accel_sequence": false, 00:06:33.599 "rdma_max_cq_size": 0, 00:06:33.599 "rdma_cm_event_timeout_ms": 0, 00:06:33.599 "dhchap_digests": [ 00:06:33.599 "sha256", 00:06:33.599 "sha384", 00:06:33.599 "sha512" 00:06:33.599 ], 00:06:33.599 "dhchap_dhgroups": [ 00:06:33.599 "null", 00:06:33.599 "ffdhe2048", 00:06:33.599 "ffdhe3072", 00:06:33.599 "ffdhe4096", 00:06:33.599 "ffdhe6144", 00:06:33.599 "ffdhe8192" 00:06:33.599 ] 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "bdev_nvme_set_hotplug", 00:06:33.599 "params": { 00:06:33.599 "period_us": 100000, 00:06:33.599 "enable": false 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "bdev_iscsi_set_options", 00:06:33.599 "params": { 00:06:33.599 "timeout_sec": 30 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "bdev_wait_for_examine" 00:06:33.599 } 00:06:33.599 ] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "nvmf", 00:06:33.599 "config": [ 00:06:33.599 { 00:06:33.599 "method": "nvmf_set_config", 00:06:33.599 "params": { 00:06:33.599 "discovery_filter": "match_any", 00:06:33.599 "admin_cmd_passthru": { 00:06:33.599 "identify_ctrlr": false 00:06:33.599 }, 00:06:33.599 "dhchap_digests": [ 00:06:33.599 "sha256", 00:06:33.599 "sha384", 00:06:33.599 "sha512" 00:06:33.599 ], 00:06:33.599 "dhchap_dhgroups": [ 00:06:33.599 "null", 00:06:33.599 "ffdhe2048", 00:06:33.599 "ffdhe3072", 00:06:33.599 "ffdhe4096", 00:06:33.599 "ffdhe6144", 00:06:33.599 "ffdhe8192" 00:06:33.599 ] 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "nvmf_set_max_subsystems", 00:06:33.599 "params": { 00:06:33.599 "max_subsystems": 1024 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "nvmf_set_crdt", 00:06:33.599 "params": { 00:06:33.599 "crdt1": 0, 00:06:33.599 "crdt2": 0, 00:06:33.599 "crdt3": 0 00:06:33.599 } 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "method": "nvmf_create_transport", 00:06:33.599 "params": { 00:06:33.599 "trtype": "TCP", 00:06:33.599 "max_queue_depth": 128, 00:06:33.599 "max_io_qpairs_per_ctrlr": 127, 00:06:33.599 "in_capsule_data_size": 4096, 00:06:33.599 "max_io_size": 131072, 00:06:33.599 "io_unit_size": 131072, 00:06:33.599 "max_aq_depth": 128, 00:06:33.599 "num_shared_buffers": 511, 00:06:33.599 "buf_cache_size": 4294967295, 00:06:33.599 "dif_insert_or_strip": false, 00:06:33.599 "zcopy": false, 00:06:33.599 "c2h_success": true, 00:06:33.599 "sock_priority": 0, 00:06:33.599 "abort_timeout_sec": 1, 00:06:33.599 "ack_timeout": 0, 00:06:33.599 "data_wr_pool_size": 0 00:06:33.599 } 00:06:33.599 } 00:06:33.599 ] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "nbd", 00:06:33.599 "config": [] 00:06:33.599 }, 00:06:33.599 { 00:06:33.599 "subsystem": "ublk", 00:06:33.599 "config": [] 00:06:33.600 }, 00:06:33.600 { 00:06:33.600 "subsystem": "vhost_blk", 00:06:33.600 "config": [] 00:06:33.600 }, 00:06:33.600 { 00:06:33.600 "subsystem": "scsi", 00:06:33.600 "config": null 00:06:33.600 }, 00:06:33.600 { 00:06:33.600 "subsystem": "iscsi", 00:06:33.600 "config": [ 00:06:33.600 { 00:06:33.600 "method": "iscsi_set_options", 00:06:33.600 "params": { 00:06:33.600 "node_base": "iqn.2016-06.io.spdk", 00:06:33.600 "max_sessions": 128, 00:06:33.600 "max_connections_per_session": 2, 00:06:33.600 "max_queue_depth": 64, 00:06:33.600 
"default_time2wait": 2, 00:06:33.600 "default_time2retain": 20, 00:06:33.600 "first_burst_length": 8192, 00:06:33.600 "immediate_data": true, 00:06:33.600 "allow_duplicated_isid": false, 00:06:33.600 "error_recovery_level": 0, 00:06:33.600 "nop_timeout": 60, 00:06:33.600 "nop_in_interval": 30, 00:06:33.600 "disable_chap": false, 00:06:33.600 "require_chap": false, 00:06:33.600 "mutual_chap": false, 00:06:33.600 "chap_group": 0, 00:06:33.600 "max_large_datain_per_connection": 64, 00:06:33.600 "max_r2t_per_connection": 4, 00:06:33.600 "pdu_pool_size": 36864, 00:06:33.600 "immediate_data_pool_size": 16384, 00:06:33.600 "data_out_pool_size": 2048 00:06:33.600 } 00:06:33.600 } 00:06:33.600 ] 00:06:33.600 }, 00:06:33.600 { 00:06:33.600 "subsystem": "vhost_scsi", 00:06:33.600 "config": [] 00:06:33.600 } 00:06:33.600 ] 00:06:33.600 } 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 3742594 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 3742594 ']' 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 3742594 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3742594 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3742594' 00:06:33.600 killing process with pid 3742594 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 3742594 00:06:33.600 16:30:31 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 3742594 00:06:33.859 16:30:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=3742615 00:06:33.859 16:30:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:33.859 16:30:31 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 3742615 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 3742615 ']' 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 3742615 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3742615 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- 
# echo 'killing process with pid 3742615' 00:06:39.135 killing process with pid 3742615 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 3742615 00:06:39.135 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 3742615 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:39.395 00:06:39.395 real 0m6.275s 00:06:39.395 user 0m5.939s 00:06:39.395 sys 0m0.650s 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:39.395 ************************************ 00:06:39.395 END TEST skip_rpc_with_json 00:06:39.395 ************************************ 00:06:39.395 16:30:36 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:39.395 16:30:36 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:39.395 16:30:36 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:39.395 16:30:36 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:39.395 ************************************ 00:06:39.395 START TEST skip_rpc_with_delay 00:06:39.395 ************************************ 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 
00:06:39.395 [2024-11-28 16:30:36.964689] app.c: 840:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:06:39.395 [2024-11-28 16:30:36.964829] app.c: 719:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:39.395 00:06:39.395 real 0m0.049s 00:06:39.395 user 0m0.019s 00:06:39.395 sys 0m0.029s 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:39.395 16:30:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:39.395 ************************************ 00:06:39.395 END TEST skip_rpc_with_delay 00:06:39.395 ************************************ 00:06:39.395 16:30:37 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:39.395 16:30:37 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:39.395 16:30:37 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:39.395 16:30:37 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:39.395 16:30:37 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:39.395 16:30:37 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:39.655 ************************************ 00:06:39.655 START TEST exit_on_failed_rpc_init 00:06:39.655 ************************************ 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=3743730 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 3743730 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 3743730 ']' 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:39.655 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:39.655 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:39.655 [2024-11-28 16:30:37.101468] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
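waitforlisten is the gate between launching spdk_tgt and talking to it: it blocks until the new pid is alive and its RPC Unix socket accepts connections. A minimal sketch of that shape, with a hypothetical helper name, illustrative retry count and sleep, and an ss-based readiness probe that stands in for whatever the real autotest_common.sh helper does:

# Hypothetical waitforlisten-style poll loop (not the SPDK source).
wait_for_rpc_socket() {
    local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} i
    for ((i = 0; i < 100; i++)); do
        # Bail out early if the target died while starting up.
        kill -0 "$pid" 2>/dev/null || return 1
        # Ready once the Unix socket exists and something is listening on it.
        [[ -S $rpc_addr ]] && ss -lx | grep -q -- "$rpc_addr" && return 0
        sleep 0.1
    done
    return 1   # never came up within ~10s
}

wait_for_rpc_socket 3743730 /var/tmp/spdk.sock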
00:06:39.655 [2024-11-28 16:30:37.101551] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3743730 ] 00:06:39.655 [2024-11-28 16:30:37.169612] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.655 [2024-11-28 16:30:37.209512] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.914 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:39.914 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:06:39.914 16:30:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:39.914 16:30:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:39.914 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:06:39.914 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:39.914 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:39.914 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:39.914 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:39.915 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:39.915 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:39.915 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:39.915 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:39.915 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:39.915 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:39.915 [2024-11-28 16:30:37.429665] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:39.915 [2024-11-28 16:30:37.429747] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3743735 ] 00:06:39.915 [2024-11-28 16:30:37.495335] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.915 [2024-11-28 16:30:37.533755] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:39.915 [2024-11-28 16:30:37.533836] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
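The failure here is the point of the test: both instances defaulted to /var/tmp/spdk.sock, so the second _spdk_rpc_listen correctly refuses the socket and the app exits non-zero. Outside a negative test, a second target would simply be given its own RPC socket with -r (socket paths below are illustrative):

# Run two targets side by side by separating their RPC sockets.
./build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk_a.sock &
./build/bin/spdk_tgt -m 0x2 -r /var/tmp/spdk_b.sock &

# Each instance is then driven through its own socket, e.g. with the
# spdk_get_version method that appears in the rpc_get_methods list below:
./scripts/rpc.py -s /var/tmp/spdk_a.sock spdk_get_version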
00:06:39.915 [2024-11-28 16:30:37.533849] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:39.915 [2024-11-28 16:30:37.533857] app.c:1061:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 3743730 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 3743730 ']' 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 3743730 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3743730 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3743730' 00:06:40.174 killing process with pid 3743730 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 3743730 00:06:40.174 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 3743730 00:06:40.433 00:06:40.433 real 0m0.887s 00:06:40.433 user 0m0.884s 00:06:40.433 sys 0m0.437s 00:06:40.433 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:40.433 16:30:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:40.433 ************************************ 00:06:40.433 END TEST exit_on_failed_rpc_init 00:06:40.433 ************************************ 00:06:40.433 16:30:38 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:40.433 00:06:40.433 real 0m13.156s 00:06:40.433 user 0m12.233s 00:06:40.433 sys 0m1.771s 00:06:40.433 16:30:38 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:40.433 16:30:38 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:40.433 ************************************ 00:06:40.433 END TEST skip_rpc 00:06:40.433 ************************************ 00:06:40.434 16:30:38 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:40.434 16:30:38 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:40.434 16:30:38 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:40.434 16:30:38 
-- common/autotest_common.sh@10 -- # set +x 00:06:40.693 ************************************ 00:06:40.693 START TEST rpc_client 00:06:40.693 ************************************ 00:06:40.693 16:30:38 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:40.694 * Looking for test storage... 00:06:40.694 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1681 -- # lcov --version 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@345 -- # : 1 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@353 -- # local d=1 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@355 -- # echo 1 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@353 -- # local d=2 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@355 -- # echo 2 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:40.694 16:30:38 rpc_client -- scripts/common.sh@368 -- # return 0 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:40.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.694 --rc genhtml_branch_coverage=1 00:06:40.694 --rc genhtml_function_coverage=1 00:06:40.694 --rc genhtml_legend=1 00:06:40.694 --rc geninfo_all_blocks=1 00:06:40.694 --rc geninfo_unexecuted_blocks=1 00:06:40.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:40.694 ' 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:40.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.694 --rc genhtml_branch_coverage=1 00:06:40.694 --rc genhtml_function_coverage=1 00:06:40.694 --rc genhtml_legend=1 00:06:40.694 --rc geninfo_all_blocks=1 00:06:40.694 --rc geninfo_unexecuted_blocks=1 00:06:40.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:40.694 ' 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:40.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.694 --rc genhtml_branch_coverage=1 00:06:40.694 --rc genhtml_function_coverage=1 00:06:40.694 --rc genhtml_legend=1 00:06:40.694 --rc geninfo_all_blocks=1 00:06:40.694 --rc geninfo_unexecuted_blocks=1 00:06:40.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:40.694 ' 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:40.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.694 --rc genhtml_branch_coverage=1 00:06:40.694 --rc genhtml_function_coverage=1 00:06:40.694 --rc genhtml_legend=1 00:06:40.694 --rc geninfo_all_blocks=1 00:06:40.694 --rc geninfo_unexecuted_blocks=1 00:06:40.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:40.694 ' 00:06:40.694 16:30:38 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:40.694 OK 00:06:40.694 16:30:38 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:40.694 00:06:40.694 real 0m0.216s 00:06:40.694 user 0m0.111s 00:06:40.694 sys 0m0.123s 00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable 
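The block traced here is scripts/common.sh deciding whether the installed lcov predates 2.x: each version string is split on '.', '-' and ':' into an array and compared component by component. Reconstructed as a sketch from the traced steps (the real script also validates every component with its decimal helper and stores the lengths in ver1_l/ver2_l):

# Sketch of the traced comparison: lt 1.15 2  ->  cmp_versions 1.15 '<' 2
cmp_versions() {
    local -a ver1 ver2
    local op=$2 v
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$3"
    for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
        # Missing components evaluate as 0, so "2" behaves like "2.0".
        if ((${ver1[v]:-0} > ${ver2[v]:-0})); then
            [[ $op == '>' || $op == '>=' ]]; return
        elif ((${ver1[v]:-0} < ${ver2[v]:-0})); then
            [[ $op == '<' || $op == '<=' ]]; return
        fi
    done
    [[ $op == '==' || $op == '<=' || $op == '>=' ]]   # all components equal
}
lt() { cmp_versions "$1" '<' "$2"; }   # 1.15 < 2, so lcov is pre-2.x here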
00:06:40.694 16:30:38 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:40.694 ************************************ 00:06:40.694 END TEST rpc_client 00:06:40.694 ************************************ 00:06:40.954 16:30:38 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:40.954 16:30:38 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:40.954 16:30:38 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:40.954 16:30:38 -- common/autotest_common.sh@10 -- # set +x 00:06:40.954 ************************************ 00:06:40.954 START TEST json_config 00:06:40.954 ************************************ 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1681 -- # lcov --version 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:40.954 16:30:38 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:40.954 16:30:38 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:40.954 16:30:38 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:40.954 16:30:38 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:06:40.954 16:30:38 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:06:40.954 16:30:38 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:06:40.954 16:30:38 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:06:40.954 16:30:38 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:06:40.954 16:30:38 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:06:40.954 16:30:38 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:06:40.954 16:30:38 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:40.954 16:30:38 json_config -- scripts/common.sh@344 -- # case "$op" in 00:06:40.954 16:30:38 json_config -- scripts/common.sh@345 -- # : 1 00:06:40.954 16:30:38 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:40.954 16:30:38 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:40.954 16:30:38 json_config -- scripts/common.sh@365 -- # decimal 1 00:06:40.954 16:30:38 json_config -- scripts/common.sh@353 -- # local d=1 00:06:40.954 16:30:38 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:40.954 16:30:38 json_config -- scripts/common.sh@355 -- # echo 1 00:06:40.954 16:30:38 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:06:40.954 16:30:38 json_config -- scripts/common.sh@366 -- # decimal 2 00:06:40.954 16:30:38 json_config -- scripts/common.sh@353 -- # local d=2 00:06:40.954 16:30:38 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:40.954 16:30:38 json_config -- scripts/common.sh@355 -- # echo 2 00:06:40.954 16:30:38 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:06:40.954 16:30:38 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:40.954 16:30:38 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:40.954 16:30:38 json_config -- scripts/common.sh@368 -- # return 0 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:40.954 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.954 --rc genhtml_branch_coverage=1 00:06:40.954 --rc genhtml_function_coverage=1 00:06:40.954 --rc genhtml_legend=1 00:06:40.954 --rc geninfo_all_blocks=1 00:06:40.954 --rc geninfo_unexecuted_blocks=1 00:06:40.954 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:40.954 ' 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:40.954 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.954 --rc genhtml_branch_coverage=1 00:06:40.954 --rc genhtml_function_coverage=1 00:06:40.954 --rc genhtml_legend=1 00:06:40.954 --rc geninfo_all_blocks=1 00:06:40.954 --rc geninfo_unexecuted_blocks=1 00:06:40.954 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:40.954 ' 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:40.954 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.954 --rc genhtml_branch_coverage=1 00:06:40.954 --rc genhtml_function_coverage=1 00:06:40.954 --rc genhtml_legend=1 00:06:40.954 --rc geninfo_all_blocks=1 00:06:40.954 --rc geninfo_unexecuted_blocks=1 00:06:40.954 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:40.954 ' 00:06:40.954 16:30:38 json_config -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:40.954 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.954 --rc genhtml_branch_coverage=1 00:06:40.954 --rc genhtml_function_coverage=1 00:06:40.954 --rc genhtml_legend=1 00:06:40.954 --rc geninfo_all_blocks=1 00:06:40.954 --rc geninfo_unexecuted_blocks=1 00:06:40.954 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:40.954 ' 00:06:40.954 16:30:38 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:40.954 16:30:38 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:06:40.954 16:30:38 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:40.954 16:30:38 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:40.954 16:30:38 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:40.954 16:30:38 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:40.954 16:30:38 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:40.954 16:30:38 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:40.954 16:30:38 json_config -- paths/export.sh@5 -- # export PATH 00:06:40.954 16:30:38 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@51 -- # : 0 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:40.954 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:40.954 16:30:38 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:40.954 16:30:38 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:40.954 16:30:38 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:40.954 16:30:38 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:40.955 16:30:38 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:40.955 16:30:38 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:40.955 16:30:38 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:40.955 WARNING: No tests are enabled so not running JSON configuration tests 00:06:40.955 16:30:38 json_config -- json_config/json_config.sh@28 -- # exit 0 00:06:40.955 00:06:40.955 real 0m0.187s 00:06:40.955 user 0m0.109s 00:06:40.955 sys 0m0.087s 00:06:40.955 16:30:38 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:40.955 16:30:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:40.955 ************************************ 00:06:40.955 END TEST json_config 00:06:40.955 ************************************ 00:06:41.215 16:30:38 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:41.215 16:30:38 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:41.215 16:30:38 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:41.215 16:30:38 -- common/autotest_common.sh@10 -- # set +x 00:06:41.215 ************************************ 00:06:41.215 START TEST json_config_extra_key 00:06:41.215 ************************************ 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1681 -- # awk '{print 
$NF}' 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1681 -- # lcov --version 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:41.215 16:30:38 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:41.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:41.215 --rc genhtml_branch_coverage=1 00:06:41.215 --rc genhtml_function_coverage=1 00:06:41.215 --rc genhtml_legend=1 00:06:41.215 --rc geninfo_all_blocks=1 00:06:41.215 --rc geninfo_unexecuted_blocks=1 00:06:41.215 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:41.215 ' 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:41.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:41.215 --rc genhtml_branch_coverage=1 00:06:41.215 
--rc genhtml_function_coverage=1 00:06:41.215 --rc genhtml_legend=1 00:06:41.215 --rc geninfo_all_blocks=1 00:06:41.215 --rc geninfo_unexecuted_blocks=1 00:06:41.215 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:41.215 ' 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:41.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:41.215 --rc genhtml_branch_coverage=1 00:06:41.215 --rc genhtml_function_coverage=1 00:06:41.215 --rc genhtml_legend=1 00:06:41.215 --rc geninfo_all_blocks=1 00:06:41.215 --rc geninfo_unexecuted_blocks=1 00:06:41.215 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:41.215 ' 00:06:41.215 16:30:38 json_config_extra_key -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:41.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:41.215 --rc genhtml_branch_coverage=1 00:06:41.215 --rc genhtml_function_coverage=1 00:06:41.215 --rc genhtml_legend=1 00:06:41.215 --rc geninfo_all_blocks=1 00:06:41.215 --rc geninfo_unexecuted_blocks=1 00:06:41.215 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:41.215 ' 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:41.216 16:30:38 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:06:41.216 16:30:38 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:41.216 16:30:38 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:41.216 16:30:38 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:41.216 16:30:38 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.216 16:30:38 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.216 16:30:38 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.216 16:30:38 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:41.216 16:30:38 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:41.216 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:41.216 16:30:38 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:41.216 INFO: launching applications... 00:06:41.216 16:30:38 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=3744169 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:41.216 Waiting for target to run... 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 3744169 /var/tmp/spdk_tgt.sock 00:06:41.216 16:30:38 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 3744169 ']' 00:06:41.216 16:30:38 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:41.216 16:30:38 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:41.216 16:30:38 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:41.216 16:30:38 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:41.216 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
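The associative arrays being populated here (app_pid, app_socket, app_params, configs_path) are test/json_config/common.sh's bookkeeping: one slot per managed app, keyed by name. A condensed sketch of how the start helper uses them, simplified from what the trace shows (the real helper adds extra params, the -r socket flag, and error traps):

declare -A app_pid=([target]='')
declare -A app_socket=([target]='/var/tmp/spdk_tgt.sock')
declare -A app_params=([target]='-m 0x1 -s 1024')

# Launch the named app with its stock parameters plus per-test extras,
# then block until its RPC socket answers (waitforlisten, as sketched
# earlier in this log).
json_config_test_start_app() {
    local app=$1; shift
    ./build/bin/spdk_tgt ${app_params[$app]} -r "${app_socket[$app]}" "$@" &
    app_pid[$app]=$!
    waitforlisten "${app_pid[$app]}" "${app_socket[$app]}"
}

# As invoked in the trace:
# json_config_test_start_app target --json test/json_config/extra_key.json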
00:06:41.216 16:30:38 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:41.216 16:30:38 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:41.476 [2024-11-28 16:30:38.878992] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:41.476 [2024-11-28 16:30:38.879080] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3744169 ] 00:06:41.736 [2024-11-28 16:30:39.312858] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.736 [2024-11-28 16:30:39.342305] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.304 16:30:39 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:42.304 16:30:39 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:06:42.304 16:30:39 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:42.304 00:06:42.304 16:30:39 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:42.304 INFO: shutting down applications... 00:06:42.304 16:30:39 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:42.305 16:30:39 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:42.305 16:30:39 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:42.305 16:30:39 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 3744169 ]] 00:06:42.305 16:30:39 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 3744169 00:06:42.305 16:30:39 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:42.305 16:30:39 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:42.305 16:30:39 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3744169 00:06:42.305 16:30:39 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:42.874 16:30:40 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:42.874 16:30:40 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:42.874 16:30:40 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 3744169 00:06:42.874 16:30:40 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:42.874 16:30:40 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:42.874 16:30:40 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:42.874 16:30:40 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:42.874 SPDK target shutdown done 00:06:42.874 16:30:40 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:42.874 Success 00:06:42.874 00:06:42.874 real 0m1.584s 00:06:42.874 user 0m1.164s 00:06:42.874 sys 0m0.584s 00:06:42.874 16:30:40 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:42.874 16:30:40 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:42.874 ************************************ 00:06:42.874 END TEST json_config_extra_key 00:06:42.874 ************************************ 00:06:42.874 16:30:40 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 
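Shutdown, traced just above, is the mirror image: SIGINT the recorded pid, then poll with kill -0 for up to 30 half-second intervals before declaring the target gone. As a sketch, simplified from the loop counted off in the trace:

json_config_test_shutdown_app() {
    local app=$1 i
    kill -SIGINT "${app_pid[$app]}"
    for ((i = 0; i < 30; i++)); do
        # kill -0 only probes whether the pid still exists; it sends nothing.
        kill -0 "${app_pid[$app]}" 2>/dev/null || break
        sleep 0.5
    done
    app_pid[$app]=''
    echo 'SPDK target shutdown done'
}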
00:06:42.874 16:30:40 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:42.874 16:30:40 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:42.874 16:30:40 -- common/autotest_common.sh@10 -- # set +x 00:06:42.874 ************************************ 00:06:42.874 START TEST alias_rpc 00:06:42.874 ************************************ 00:06:42.874 16:30:40 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:42.874 * Looking for test storage... 00:06:42.874 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:42.874 16:30:40 alias_rpc -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:42.874 16:30:40 alias_rpc -- common/autotest_common.sh@1681 -- # lcov --version 00:06:42.874 16:30:40 alias_rpc -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:42.874 16:30:40 alias_rpc -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@345 -- # : 1 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:42.874 16:30:40 alias_rpc -- scripts/common.sh@368 -- # return 0 00:06:42.874 16:30:40 alias_rpc -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:42.874 16:30:40 alias_rpc -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:42.874 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:42.874 --rc genhtml_branch_coverage=1 00:06:42.875 --rc genhtml_function_coverage=1 00:06:42.875 --rc genhtml_legend=1 00:06:42.875 --rc geninfo_all_blocks=1 00:06:42.875 --rc geninfo_unexecuted_blocks=1 00:06:42.875 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:42.875 ' 00:06:42.875 16:30:40 alias_rpc -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:42.875 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:42.875 --rc genhtml_branch_coverage=1 00:06:42.875 --rc genhtml_function_coverage=1 00:06:42.875 --rc genhtml_legend=1 00:06:42.875 --rc geninfo_all_blocks=1 00:06:42.875 --rc geninfo_unexecuted_blocks=1 00:06:42.875 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:42.875 ' 00:06:42.875 16:30:40 alias_rpc -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:42.875 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:42.875 --rc genhtml_branch_coverage=1 00:06:42.875 --rc genhtml_function_coverage=1 00:06:42.875 --rc genhtml_legend=1 00:06:42.875 --rc geninfo_all_blocks=1 00:06:42.875 --rc geninfo_unexecuted_blocks=1 00:06:42.875 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:42.875 ' 00:06:42.875 16:30:40 alias_rpc -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:42.875 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:42.875 --rc genhtml_branch_coverage=1 00:06:42.875 --rc genhtml_function_coverage=1 00:06:42.875 --rc genhtml_legend=1 00:06:42.875 --rc geninfo_all_blocks=1 00:06:42.875 --rc geninfo_unexecuted_blocks=1 00:06:42.875 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:42.875 ' 00:06:42.875 16:30:40 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:42.875 16:30:40 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3744497 00:06:42.875 16:30:40 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3744497 00:06:42.875 16:30:40 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:42.875 16:30:40 alias_rpc -- 
common/autotest_common.sh@831 -- # '[' -z 3744497 ']' 00:06:42.875 16:30:40 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:42.875 16:30:40 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:42.875 16:30:40 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:42.875 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:42.875 16:30:40 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:42.875 16:30:40 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:43.134 [2024-11-28 16:30:40.537779] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:43.134 [2024-11-28 16:30:40.537860] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3744497 ] 00:06:43.134 [2024-11-28 16:30:40.603883] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.134 [2024-11-28 16:30:40.641428] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.394 16:30:40 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:43.394 16:30:40 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:06:43.394 16:30:40 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:43.653 16:30:41 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3744497 00:06:43.653 16:30:41 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 3744497 ']' 00:06:43.653 16:30:41 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 3744497 00:06:43.654 16:30:41 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:06:43.654 16:30:41 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:43.654 16:30:41 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3744497 00:06:43.654 16:30:41 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:43.654 16:30:41 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:43.654 16:30:41 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3744497' 00:06:43.654 killing process with pid 3744497 00:06:43.654 16:30:41 alias_rpc -- common/autotest_common.sh@969 -- # kill 3744497 00:06:43.654 16:30:41 alias_rpc -- common/autotest_common.sh@974 -- # wait 3744497 00:06:43.914 00:06:43.914 real 0m1.096s 00:06:43.914 user 0m1.077s 00:06:43.914 sys 0m0.456s 00:06:43.914 16:30:41 alias_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:43.914 16:30:41 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:43.914 ************************************ 00:06:43.914 END TEST alias_rpc 00:06:43.914 ************************************ 00:06:43.914 16:30:41 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:06:43.914 16:30:41 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:43.914 16:30:41 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:43.914 16:30:41 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:43.914 16:30:41 -- common/autotest_common.sh@10 -- # set +x 00:06:43.914 ************************************ 00:06:43.914 START TEST 
spdkcli_tcp 00:06:43.914 ************************************ 00:06:43.914 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:44.174 * Looking for test storage... 00:06:44.174 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lcov --version 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:44.174 16:30:41 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:44.174 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:44.174 --rc genhtml_branch_coverage=1 00:06:44.174 --rc genhtml_function_coverage=1 00:06:44.174 --rc genhtml_legend=1 00:06:44.174 --rc geninfo_all_blocks=1 00:06:44.174 --rc geninfo_unexecuted_blocks=1 00:06:44.174 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:44.174 ' 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:44.174 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:44.174 --rc genhtml_branch_coverage=1 00:06:44.174 --rc genhtml_function_coverage=1 00:06:44.174 --rc genhtml_legend=1 00:06:44.174 --rc geninfo_all_blocks=1 00:06:44.174 --rc geninfo_unexecuted_blocks=1 00:06:44.174 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:44.174 ' 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:44.174 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:44.174 --rc genhtml_branch_coverage=1 00:06:44.174 --rc genhtml_function_coverage=1 00:06:44.174 --rc genhtml_legend=1 00:06:44.174 --rc geninfo_all_blocks=1 00:06:44.174 --rc geninfo_unexecuted_blocks=1 00:06:44.174 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:44.174 ' 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:44.174 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:44.174 --rc genhtml_branch_coverage=1 00:06:44.174 --rc genhtml_function_coverage=1 00:06:44.174 --rc genhtml_legend=1 00:06:44.174 --rc geninfo_all_blocks=1 00:06:44.174 --rc geninfo_unexecuted_blocks=1 00:06:44.174 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:44.174 ' 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3744820 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 3744820 00:06:44.174 16:30:41 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 3744820 ']' 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:44.174 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:44.174 16:30:41 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:44.174 [2024-11-28 16:30:41.729984] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:44.174 [2024-11-28 16:30:41.730049] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3744820 ] 00:06:44.174 [2024-11-28 16:30:41.795664] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:44.434 [2024-11-28 16:30:41.834876] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:44.434 [2024-11-28 16:30:41.834879] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:44.434 16:30:42 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:44.434 16:30:42 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:06:44.434 16:30:42 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=3744831 00:06:44.434 16:30:42 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:44.434 16:30:42 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:44.695 [ 00:06:44.695 "spdk_get_version", 00:06:44.695 "rpc_get_methods", 00:06:44.695 "notify_get_notifications", 00:06:44.695 "notify_get_types", 00:06:44.695 "trace_get_info", 00:06:44.695 "trace_get_tpoint_group_mask", 00:06:44.695 "trace_disable_tpoint_group", 00:06:44.695 "trace_enable_tpoint_group", 00:06:44.695 "trace_clear_tpoint_mask", 00:06:44.695 "trace_set_tpoint_mask", 00:06:44.695 "fsdev_set_opts", 00:06:44.695 "fsdev_get_opts", 00:06:44.695 "framework_get_pci_devices", 00:06:44.695 "framework_get_config", 00:06:44.695 "framework_get_subsystems", 00:06:44.695 "vfu_tgt_set_base_path", 00:06:44.695 
"keyring_get_keys", 00:06:44.695 "iobuf_get_stats", 00:06:44.695 "iobuf_set_options", 00:06:44.695 "sock_get_default_impl", 00:06:44.695 "sock_set_default_impl", 00:06:44.695 "sock_impl_set_options", 00:06:44.695 "sock_impl_get_options", 00:06:44.695 "vmd_rescan", 00:06:44.695 "vmd_remove_device", 00:06:44.695 "vmd_enable", 00:06:44.695 "accel_get_stats", 00:06:44.695 "accel_set_options", 00:06:44.695 "accel_set_driver", 00:06:44.695 "accel_crypto_key_destroy", 00:06:44.695 "accel_crypto_keys_get", 00:06:44.695 "accel_crypto_key_create", 00:06:44.695 "accel_assign_opc", 00:06:44.695 "accel_get_module_info", 00:06:44.695 "accel_get_opc_assignments", 00:06:44.695 "bdev_get_histogram", 00:06:44.695 "bdev_enable_histogram", 00:06:44.695 "bdev_set_qos_limit", 00:06:44.695 "bdev_set_qd_sampling_period", 00:06:44.695 "bdev_get_bdevs", 00:06:44.695 "bdev_reset_iostat", 00:06:44.695 "bdev_get_iostat", 00:06:44.695 "bdev_examine", 00:06:44.695 "bdev_wait_for_examine", 00:06:44.695 "bdev_set_options", 00:06:44.695 "scsi_get_devices", 00:06:44.695 "thread_set_cpumask", 00:06:44.695 "scheduler_set_options", 00:06:44.695 "framework_get_governor", 00:06:44.695 "framework_get_scheduler", 00:06:44.695 "framework_set_scheduler", 00:06:44.695 "framework_get_reactors", 00:06:44.695 "thread_get_io_channels", 00:06:44.695 "thread_get_pollers", 00:06:44.695 "thread_get_stats", 00:06:44.695 "framework_monitor_context_switch", 00:06:44.695 "spdk_kill_instance", 00:06:44.695 "log_enable_timestamps", 00:06:44.695 "log_get_flags", 00:06:44.695 "log_clear_flag", 00:06:44.695 "log_set_flag", 00:06:44.695 "log_get_level", 00:06:44.695 "log_set_level", 00:06:44.695 "log_get_print_level", 00:06:44.695 "log_set_print_level", 00:06:44.695 "framework_enable_cpumask_locks", 00:06:44.695 "framework_disable_cpumask_locks", 00:06:44.695 "framework_wait_init", 00:06:44.695 "framework_start_init", 00:06:44.695 "virtio_blk_create_transport", 00:06:44.695 "virtio_blk_get_transports", 00:06:44.695 "vhost_controller_set_coalescing", 00:06:44.695 "vhost_get_controllers", 00:06:44.695 "vhost_delete_controller", 00:06:44.695 "vhost_create_blk_controller", 00:06:44.695 "vhost_scsi_controller_remove_target", 00:06:44.695 "vhost_scsi_controller_add_target", 00:06:44.695 "vhost_start_scsi_controller", 00:06:44.695 "vhost_create_scsi_controller", 00:06:44.695 "ublk_recover_disk", 00:06:44.695 "ublk_get_disks", 00:06:44.695 "ublk_stop_disk", 00:06:44.695 "ublk_start_disk", 00:06:44.695 "ublk_destroy_target", 00:06:44.695 "ublk_create_target", 00:06:44.695 "nbd_get_disks", 00:06:44.695 "nbd_stop_disk", 00:06:44.695 "nbd_start_disk", 00:06:44.695 "env_dpdk_get_mem_stats", 00:06:44.695 "nvmf_stop_mdns_prr", 00:06:44.695 "nvmf_publish_mdns_prr", 00:06:44.695 "nvmf_subsystem_get_listeners", 00:06:44.695 "nvmf_subsystem_get_qpairs", 00:06:44.695 "nvmf_subsystem_get_controllers", 00:06:44.695 "nvmf_get_stats", 00:06:44.695 "nvmf_get_transports", 00:06:44.695 "nvmf_create_transport", 00:06:44.695 "nvmf_get_targets", 00:06:44.695 "nvmf_delete_target", 00:06:44.695 "nvmf_create_target", 00:06:44.695 "nvmf_subsystem_allow_any_host", 00:06:44.695 "nvmf_subsystem_set_keys", 00:06:44.695 "nvmf_subsystem_remove_host", 00:06:44.695 "nvmf_subsystem_add_host", 00:06:44.695 "nvmf_ns_remove_host", 00:06:44.695 "nvmf_ns_add_host", 00:06:44.695 "nvmf_subsystem_remove_ns", 00:06:44.695 "nvmf_subsystem_set_ns_ana_group", 00:06:44.695 "nvmf_subsystem_add_ns", 00:06:44.695 "nvmf_subsystem_listener_set_ana_state", 00:06:44.695 "nvmf_discovery_get_referrals", 
00:06:44.695 "nvmf_discovery_remove_referral", 00:06:44.695 "nvmf_discovery_add_referral", 00:06:44.695 "nvmf_subsystem_remove_listener", 00:06:44.695 "nvmf_subsystem_add_listener", 00:06:44.695 "nvmf_delete_subsystem", 00:06:44.695 "nvmf_create_subsystem", 00:06:44.695 "nvmf_get_subsystems", 00:06:44.695 "nvmf_set_crdt", 00:06:44.695 "nvmf_set_config", 00:06:44.695 "nvmf_set_max_subsystems", 00:06:44.695 "iscsi_get_histogram", 00:06:44.695 "iscsi_enable_histogram", 00:06:44.695 "iscsi_set_options", 00:06:44.695 "iscsi_get_auth_groups", 00:06:44.695 "iscsi_auth_group_remove_secret", 00:06:44.695 "iscsi_auth_group_add_secret", 00:06:44.695 "iscsi_delete_auth_group", 00:06:44.695 "iscsi_create_auth_group", 00:06:44.695 "iscsi_set_discovery_auth", 00:06:44.695 "iscsi_get_options", 00:06:44.695 "iscsi_target_node_request_logout", 00:06:44.695 "iscsi_target_node_set_redirect", 00:06:44.695 "iscsi_target_node_set_auth", 00:06:44.695 "iscsi_target_node_add_lun", 00:06:44.695 "iscsi_get_stats", 00:06:44.695 "iscsi_get_connections", 00:06:44.695 "iscsi_portal_group_set_auth", 00:06:44.695 "iscsi_start_portal_group", 00:06:44.695 "iscsi_delete_portal_group", 00:06:44.695 "iscsi_create_portal_group", 00:06:44.695 "iscsi_get_portal_groups", 00:06:44.695 "iscsi_delete_target_node", 00:06:44.695 "iscsi_target_node_remove_pg_ig_maps", 00:06:44.695 "iscsi_target_node_add_pg_ig_maps", 00:06:44.695 "iscsi_create_target_node", 00:06:44.695 "iscsi_get_target_nodes", 00:06:44.695 "iscsi_delete_initiator_group", 00:06:44.695 "iscsi_initiator_group_remove_initiators", 00:06:44.695 "iscsi_initiator_group_add_initiators", 00:06:44.695 "iscsi_create_initiator_group", 00:06:44.695 "iscsi_get_initiator_groups", 00:06:44.695 "fsdev_aio_delete", 00:06:44.695 "fsdev_aio_create", 00:06:44.695 "keyring_linux_set_options", 00:06:44.696 "keyring_file_remove_key", 00:06:44.696 "keyring_file_add_key", 00:06:44.696 "vfu_virtio_create_fs_endpoint", 00:06:44.696 "vfu_virtio_create_scsi_endpoint", 00:06:44.696 "vfu_virtio_scsi_remove_target", 00:06:44.696 "vfu_virtio_scsi_add_target", 00:06:44.696 "vfu_virtio_create_blk_endpoint", 00:06:44.696 "vfu_virtio_delete_endpoint", 00:06:44.696 "iaa_scan_accel_module", 00:06:44.696 "dsa_scan_accel_module", 00:06:44.696 "ioat_scan_accel_module", 00:06:44.696 "accel_error_inject_error", 00:06:44.696 "bdev_iscsi_delete", 00:06:44.696 "bdev_iscsi_create", 00:06:44.696 "bdev_iscsi_set_options", 00:06:44.696 "bdev_virtio_attach_controller", 00:06:44.696 "bdev_virtio_scsi_get_devices", 00:06:44.696 "bdev_virtio_detach_controller", 00:06:44.696 "bdev_virtio_blk_set_hotplug", 00:06:44.696 "bdev_ftl_set_property", 00:06:44.696 "bdev_ftl_get_properties", 00:06:44.696 "bdev_ftl_get_stats", 00:06:44.696 "bdev_ftl_unmap", 00:06:44.696 "bdev_ftl_unload", 00:06:44.696 "bdev_ftl_delete", 00:06:44.696 "bdev_ftl_load", 00:06:44.696 "bdev_ftl_create", 00:06:44.696 "bdev_aio_delete", 00:06:44.696 "bdev_aio_rescan", 00:06:44.696 "bdev_aio_create", 00:06:44.696 "blobfs_create", 00:06:44.696 "blobfs_detect", 00:06:44.696 "blobfs_set_cache_size", 00:06:44.696 "bdev_zone_block_delete", 00:06:44.696 "bdev_zone_block_create", 00:06:44.696 "bdev_delay_delete", 00:06:44.696 "bdev_delay_create", 00:06:44.696 "bdev_delay_update_latency", 00:06:44.696 "bdev_split_delete", 00:06:44.696 "bdev_split_create", 00:06:44.696 "bdev_error_inject_error", 00:06:44.696 "bdev_error_delete", 00:06:44.696 "bdev_error_create", 00:06:44.696 "bdev_raid_set_options", 00:06:44.696 "bdev_raid_remove_base_bdev", 00:06:44.696 
"bdev_raid_add_base_bdev", 00:06:44.696 "bdev_raid_delete", 00:06:44.696 "bdev_raid_create", 00:06:44.696 "bdev_raid_get_bdevs", 00:06:44.696 "bdev_lvol_set_parent_bdev", 00:06:44.696 "bdev_lvol_set_parent", 00:06:44.696 "bdev_lvol_check_shallow_copy", 00:06:44.696 "bdev_lvol_start_shallow_copy", 00:06:44.696 "bdev_lvol_grow_lvstore", 00:06:44.696 "bdev_lvol_get_lvols", 00:06:44.696 "bdev_lvol_get_lvstores", 00:06:44.696 "bdev_lvol_delete", 00:06:44.696 "bdev_lvol_set_read_only", 00:06:44.696 "bdev_lvol_resize", 00:06:44.696 "bdev_lvol_decouple_parent", 00:06:44.696 "bdev_lvol_inflate", 00:06:44.696 "bdev_lvol_rename", 00:06:44.696 "bdev_lvol_clone_bdev", 00:06:44.696 "bdev_lvol_clone", 00:06:44.696 "bdev_lvol_snapshot", 00:06:44.696 "bdev_lvol_create", 00:06:44.696 "bdev_lvol_delete_lvstore", 00:06:44.696 "bdev_lvol_rename_lvstore", 00:06:44.696 "bdev_lvol_create_lvstore", 00:06:44.696 "bdev_passthru_delete", 00:06:44.696 "bdev_passthru_create", 00:06:44.696 "bdev_nvme_cuse_unregister", 00:06:44.696 "bdev_nvme_cuse_register", 00:06:44.696 "bdev_opal_new_user", 00:06:44.696 "bdev_opal_set_lock_state", 00:06:44.696 "bdev_opal_delete", 00:06:44.696 "bdev_opal_get_info", 00:06:44.696 "bdev_opal_create", 00:06:44.696 "bdev_nvme_opal_revert", 00:06:44.696 "bdev_nvme_opal_init", 00:06:44.696 "bdev_nvme_send_cmd", 00:06:44.696 "bdev_nvme_set_keys", 00:06:44.696 "bdev_nvme_get_path_iostat", 00:06:44.696 "bdev_nvme_get_mdns_discovery_info", 00:06:44.696 "bdev_nvme_stop_mdns_discovery", 00:06:44.696 "bdev_nvme_start_mdns_discovery", 00:06:44.696 "bdev_nvme_set_multipath_policy", 00:06:44.696 "bdev_nvme_set_preferred_path", 00:06:44.696 "bdev_nvme_get_io_paths", 00:06:44.696 "bdev_nvme_remove_error_injection", 00:06:44.696 "bdev_nvme_add_error_injection", 00:06:44.696 "bdev_nvme_get_discovery_info", 00:06:44.696 "bdev_nvme_stop_discovery", 00:06:44.696 "bdev_nvme_start_discovery", 00:06:44.696 "bdev_nvme_get_controller_health_info", 00:06:44.696 "bdev_nvme_disable_controller", 00:06:44.696 "bdev_nvme_enable_controller", 00:06:44.696 "bdev_nvme_reset_controller", 00:06:44.696 "bdev_nvme_get_transport_statistics", 00:06:44.696 "bdev_nvme_apply_firmware", 00:06:44.696 "bdev_nvme_detach_controller", 00:06:44.696 "bdev_nvme_get_controllers", 00:06:44.696 "bdev_nvme_attach_controller", 00:06:44.696 "bdev_nvme_set_hotplug", 00:06:44.696 "bdev_nvme_set_options", 00:06:44.696 "bdev_null_resize", 00:06:44.696 "bdev_null_delete", 00:06:44.696 "bdev_null_create", 00:06:44.696 "bdev_malloc_delete", 00:06:44.696 "bdev_malloc_create" 00:06:44.696 ] 00:06:44.696 16:30:42 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:44.696 16:30:42 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:44.696 16:30:42 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 3744820 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@950 -- # '[' -z 3744820 ']' 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 3744820 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3744820 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:44.696 
16:30:42 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3744820' 00:06:44.696 killing process with pid 3744820 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 3744820 00:06:44.696 16:30:42 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 3744820 00:06:45.266 00:06:45.266 real 0m1.120s 00:06:45.266 user 0m1.833s 00:06:45.266 sys 0m0.487s 00:06:45.266 16:30:42 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:45.266 16:30:42 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:45.266 ************************************ 00:06:45.266 END TEST spdkcli_tcp 00:06:45.266 ************************************ 00:06:45.266 16:30:42 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:45.266 16:30:42 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:45.266 16:30:42 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:45.266 16:30:42 -- common/autotest_common.sh@10 -- # set +x 00:06:45.266 ************************************ 00:06:45.266 START TEST dpdk_mem_utility 00:06:45.266 ************************************ 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:45.266 * Looking for test storage... 00:06:45.266 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lcov --version 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:45.266 16:30:42 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:45.266 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.266 --rc genhtml_branch_coverage=1 00:06:45.266 --rc genhtml_function_coverage=1 00:06:45.266 --rc genhtml_legend=1 00:06:45.266 --rc geninfo_all_blocks=1 00:06:45.266 --rc geninfo_unexecuted_blocks=1 00:06:45.266 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.266 ' 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:45.266 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.266 --rc genhtml_branch_coverage=1 00:06:45.266 --rc genhtml_function_coverage=1 00:06:45.266 --rc genhtml_legend=1 00:06:45.266 --rc geninfo_all_blocks=1 00:06:45.266 --rc geninfo_unexecuted_blocks=1 00:06:45.266 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.266 ' 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:45.266 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.266 --rc genhtml_branch_coverage=1 00:06:45.266 --rc genhtml_function_coverage=1 00:06:45.266 --rc genhtml_legend=1 00:06:45.266 --rc geninfo_all_blocks=1 00:06:45.266 --rc geninfo_unexecuted_blocks=1 00:06:45.266 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.266 ' 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:45.266 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:45.266 --rc genhtml_branch_coverage=1 00:06:45.266 --rc genhtml_function_coverage=1 00:06:45.266 --rc genhtml_legend=1 00:06:45.266 --rc geninfo_all_blocks=1 00:06:45.266 --rc geninfo_unexecuted_blocks=1 00:06:45.266 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:45.266 ' 00:06:45.266 16:30:42 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:45.266 16:30:42 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3745156 00:06:45.266 16:30:42 dpdk_mem_utility -- 
dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3745156 00:06:45.266 16:30:42 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 3745156 ']' 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:45.266 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:45.266 16:30:42 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:45.526 [2024-11-28 16:30:42.914232] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:45.526 [2024-11-28 16:30:42.914299] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3745156 ] 00:06:45.526 [2024-11-28 16:30:42.980060] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.526 [2024-11-28 16:30:43.017950] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.787 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:45.787 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:06:45.787 16:30:43 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:45.787 16:30:43 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:45.787 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:45.787 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:45.787 { 00:06:45.787 "filename": "/tmp/spdk_mem_dump.txt" 00:06:45.787 } 00:06:45.787 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:45.787 16:30:43 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:45.787 DPDK memory size 860.000000 MiB in 1 heap(s) 00:06:45.787 1 heaps totaling size 860.000000 MiB 00:06:45.787 size: 860.000000 MiB heap id: 0 00:06:45.787 end heaps---------- 00:06:45.788 9 mempools totaling size 642.649841 MiB 00:06:45.788 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:45.788 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:45.788 size: 92.545471 MiB name: bdev_io_3745156 00:06:45.788 size: 51.011292 MiB name: evtpool_3745156 00:06:45.788 size: 50.003479 MiB name: msgpool_3745156 00:06:45.788 size: 36.509338 MiB name: fsdev_io_3745156 00:06:45.788 size: 21.763794 MiB name: PDU_Pool 00:06:45.788 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:45.788 size: 0.026123 MiB name: Session_Pool 00:06:45.788 end mempools------- 00:06:45.788 6 memzones totaling size 4.142822 MiB 00:06:45.788 size: 1.000366 MiB name: RG_ring_0_3745156 00:06:45.788 size: 1.000366 MiB name: RG_ring_1_3745156 00:06:45.788 size: 1.000366 MiB name: 
RG_ring_4_3745156 00:06:45.788 size: 1.000366 MiB name: RG_ring_5_3745156 00:06:45.788 size: 0.125366 MiB name: RG_ring_2_3745156 00:06:45.788 size: 0.015991 MiB name: RG_ring_3_3745156 00:06:45.788 end memzones------- 00:06:45.788 16:30:43 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:45.788 heap id: 0 total size: 860.000000 MiB number of busy elements: 44 number of free elements: 16 00:06:45.788 list of free elements. size: 13.984680 MiB 00:06:45.788 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:45.788 element at address: 0x200000800000 with size: 1.996948 MiB 00:06:45.788 element at address: 0x20001bc00000 with size: 0.999878 MiB 00:06:45.788 element at address: 0x20001be00000 with size: 0.999878 MiB 00:06:45.788 element at address: 0x200034a00000 with size: 0.994446 MiB 00:06:45.788 element at address: 0x20000b200000 with size: 0.959839 MiB 00:06:45.788 element at address: 0x200015e00000 with size: 0.954285 MiB 00:06:45.788 element at address: 0x20001c000000 with size: 0.936584 MiB 00:06:45.788 element at address: 0x200000200000 with size: 0.841614 MiB 00:06:45.788 element at address: 0x20001d800000 with size: 0.582886 MiB 00:06:45.788 element at address: 0x200003e00000 with size: 0.495605 MiB 00:06:45.788 element at address: 0x200007000000 with size: 0.490723 MiB 00:06:45.788 element at address: 0x20001c200000 with size: 0.485657 MiB 00:06:45.788 element at address: 0x200013800000 with size: 0.481934 MiB 00:06:45.788 element at address: 0x20002ac00000 with size: 0.410034 MiB 00:06:45.788 element at address: 0x200003a00000 with size: 0.354858 MiB 00:06:45.788 list of standard malloc elements. size: 199.218628 MiB 00:06:45.788 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:45.788 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:45.788 element at address: 0x20001bcfff80 with size: 1.000122 MiB 00:06:45.788 element at address: 0x20001befff80 with size: 1.000122 MiB 00:06:45.788 element at address: 0x20001c0fff80 with size: 1.000122 MiB 00:06:45.788 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:45.788 element at address: 0x20001c0eff00 with size: 0.062622 MiB 00:06:45.788 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:45.788 element at address: 0x20001c0efdc0 with size: 0.000305 MiB 00:06:45.788 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:06:45.788 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:06:45.788 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:06:45.788 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:06:45.788 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:06:45.788 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:06:45.788 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003a5ad80 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003a5af80 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003adb300 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003adb500 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003affa80 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003affb40 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003e7ee00 
with size: 0.000183 MiB 00:06:45.788 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20000707da00 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20000707dac0 with size: 0.000183 MiB 00:06:45.788 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20001387b600 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20001387b6c0 with size: 0.000183 MiB 00:06:45.788 element at address: 0x2000138fb980 with size: 0.000183 MiB 00:06:45.788 element at address: 0x200015ef44c0 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20001c0efc40 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20001c0efd00 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20001c2bc740 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20001d895380 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20001d895440 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20002ac68f80 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20002ac69040 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20002ac6fc40 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20002ac6fe40 with size: 0.000183 MiB 00:06:45.788 element at address: 0x20002ac6ff00 with size: 0.000183 MiB 00:06:45.788 list of memzone associated elements. size: 646.796692 MiB 00:06:45.788 element at address: 0x20001d895500 with size: 211.416748 MiB 00:06:45.788 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:45.788 element at address: 0x20002ac6ffc0 with size: 157.562561 MiB 00:06:45.788 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:45.788 element at address: 0x200015ff4780 with size: 92.045044 MiB 00:06:45.788 associated memzone info: size: 92.044922 MiB name: MP_bdev_io_3745156_0 00:06:45.788 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:45.788 associated memzone info: size: 48.002930 MiB name: MP_evtpool_3745156_0 00:06:45.788 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:45.788 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3745156_0 00:06:45.788 element at address: 0x2000139fdb80 with size: 36.008911 MiB 00:06:45.788 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_3745156_0 00:06:45.788 element at address: 0x20001c3be940 with size: 20.255554 MiB 00:06:45.788 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:45.788 element at address: 0x200034bfeb40 with size: 18.005066 MiB 00:06:45.788 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:45.788 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:45.788 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_3745156 00:06:45.788 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:45.788 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3745156 00:06:45.788 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:06:45.788 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3745156 00:06:45.788 element at address: 0x2000138fba40 with size: 1.008118 MiB 00:06:45.788 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:45.788 element at address: 0x20001c2bc800 with size: 1.008118 MiB 00:06:45.788 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:45.788 element at address: 
0x20000b2fde40 with size: 1.008118 MiB 00:06:45.788 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:45.788 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:45.788 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:45.788 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:45.788 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3745156 00:06:45.788 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:45.788 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3745156 00:06:45.788 element at address: 0x200015ef4580 with size: 1.000488 MiB 00:06:45.788 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3745156 00:06:45.788 element at address: 0x200034afe940 with size: 1.000488 MiB 00:06:45.788 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3745156 00:06:45.788 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:06:45.788 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_3745156 00:06:45.788 element at address: 0x200003e7eec0 with size: 0.500488 MiB 00:06:45.788 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3745156 00:06:45.788 element at address: 0x20001387b780 with size: 0.500488 MiB 00:06:45.788 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:45.788 element at address: 0x20000707db80 with size: 0.500488 MiB 00:06:45.788 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:45.788 element at address: 0x20001c27c540 with size: 0.250488 MiB 00:06:45.788 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:45.788 element at address: 0x200003adf880 with size: 0.125488 MiB 00:06:45.788 associated memzone info: size: 0.125366 MiB name: RG_ring_2_3745156 00:06:45.788 element at address: 0x20000b2f5b80 with size: 0.031738 MiB 00:06:45.788 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:45.788 element at address: 0x20002ac69100 with size: 0.023743 MiB 00:06:45.788 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:45.788 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:06:45.788 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3745156 00:06:45.788 element at address: 0x20002ac6f240 with size: 0.002441 MiB 00:06:45.788 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:45.788 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:06:45.788 associated memzone info: size: 0.000183 MiB name: MP_msgpool_3745156 00:06:45.788 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:06:45.788 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_3745156 00:06:45.788 element at address: 0x200003a5ae40 with size: 0.000305 MiB 00:06:45.788 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3745156 00:06:45.788 element at address: 0x20002ac6fd00 with size: 0.000305 MiB 00:06:45.789 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:45.789 16:30:43 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:45.789 16:30:43 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3745156 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 3745156 ']' 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 3745156 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@955 -- 
# uname 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3745156 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3745156' 00:06:45.789 killing process with pid 3745156 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 3745156 00:06:45.789 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 3745156 00:06:46.049 00:06:46.049 real 0m0.984s 00:06:46.049 user 0m0.885s 00:06:46.049 sys 0m0.451s 00:06:46.049 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:46.049 16:30:43 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:46.049 ************************************ 00:06:46.049 END TEST dpdk_mem_utility 00:06:46.049 ************************************ 00:06:46.309 16:30:43 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:46.309 16:30:43 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:46.309 16:30:43 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:46.309 16:30:43 -- common/autotest_common.sh@10 -- # set +x 00:06:46.309 ************************************ 00:06:46.309 START TEST event 00:06:46.309 ************************************ 00:06:46.309 16:30:43 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:46.309 * Looking for test storage... 00:06:46.309 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:46.309 16:30:43 event -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:46.309 16:30:43 event -- common/autotest_common.sh@1681 -- # lcov --version 00:06:46.309 16:30:43 event -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:46.309 16:30:43 event -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:46.309 16:30:43 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:46.309 16:30:43 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:46.309 16:30:43 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:46.309 16:30:43 event -- scripts/common.sh@336 -- # IFS=.-: 00:06:46.309 16:30:43 event -- scripts/common.sh@336 -- # read -ra ver1 00:06:46.309 16:30:43 event -- scripts/common.sh@337 -- # IFS=.-: 00:06:46.309 16:30:43 event -- scripts/common.sh@337 -- # read -ra ver2 00:06:46.309 16:30:43 event -- scripts/common.sh@338 -- # local 'op=<' 00:06:46.309 16:30:43 event -- scripts/common.sh@340 -- # ver1_l=2 00:06:46.309 16:30:43 event -- scripts/common.sh@341 -- # ver2_l=1 00:06:46.309 16:30:43 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:46.309 16:30:43 event -- scripts/common.sh@344 -- # case "$op" in 00:06:46.309 16:30:43 event -- scripts/common.sh@345 -- # : 1 00:06:46.309 16:30:43 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:46.309 16:30:43 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:46.309 16:30:43 event -- scripts/common.sh@365 -- # decimal 1 00:06:46.309 16:30:43 event -- scripts/common.sh@353 -- # local d=1 00:06:46.309 16:30:43 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:46.309 16:30:43 event -- scripts/common.sh@355 -- # echo 1 00:06:46.309 16:30:43 event -- scripts/common.sh@365 -- # ver1[v]=1 00:06:46.568 16:30:43 event -- scripts/common.sh@366 -- # decimal 2 00:06:46.568 16:30:43 event -- scripts/common.sh@353 -- # local d=2 00:06:46.568 16:30:43 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:46.568 16:30:43 event -- scripts/common.sh@355 -- # echo 2 00:06:46.568 16:30:43 event -- scripts/common.sh@366 -- # ver2[v]=2 00:06:46.568 16:30:43 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:46.568 16:30:43 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:46.568 16:30:43 event -- scripts/common.sh@368 -- # return 0 00:06:46.568 16:30:43 event -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:46.568 16:30:43 event -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:46.568 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.568 --rc genhtml_branch_coverage=1 00:06:46.568 --rc genhtml_function_coverage=1 00:06:46.568 --rc genhtml_legend=1 00:06:46.568 --rc geninfo_all_blocks=1 00:06:46.568 --rc geninfo_unexecuted_blocks=1 00:06:46.568 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.568 ' 00:06:46.568 16:30:43 event -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:46.568 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.568 --rc genhtml_branch_coverage=1 00:06:46.568 --rc genhtml_function_coverage=1 00:06:46.568 --rc genhtml_legend=1 00:06:46.568 --rc geninfo_all_blocks=1 00:06:46.569 --rc geninfo_unexecuted_blocks=1 00:06:46.569 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.569 ' 00:06:46.569 16:30:43 event -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:46.569 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.569 --rc genhtml_branch_coverage=1 00:06:46.569 --rc genhtml_function_coverage=1 00:06:46.569 --rc genhtml_legend=1 00:06:46.569 --rc geninfo_all_blocks=1 00:06:46.569 --rc geninfo_unexecuted_blocks=1 00:06:46.569 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.569 ' 00:06:46.569 16:30:43 event -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:46.569 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:46.569 --rc genhtml_branch_coverage=1 00:06:46.569 --rc genhtml_function_coverage=1 00:06:46.569 --rc genhtml_legend=1 00:06:46.569 --rc geninfo_all_blocks=1 00:06:46.569 --rc geninfo_unexecuted_blocks=1 00:06:46.569 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:46.569 ' 00:06:46.569 16:30:43 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:46.569 16:30:43 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:46.569 16:30:43 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:46.569 16:30:43 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:06:46.569 16:30:43 event -- common/autotest_common.sh@1107 -- # xtrace_disable 
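The scripts/common.sh preamble traced above runs before every suite: lt 1.15 2 asks whether the installed lcov predates 2.x by splitting both version strings on '.', '-' and ':' (IFS=.-: read -ra ver1/ver2) and comparing component by component; 1 < 2 settles it at the first component, lt returns 0, and the pre-2.x LCOV_OPTS are exported. A compact standalone equivalent, shown here with GNU sort -V rather than the array walk common.sh actually performs:

    # version_lt A B: succeeds when A sorts strictly before B in version order
    version_lt() {
        [ "$1" != "$2" ] && [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
    }
    version_lt 1.15 2 && echo "lcov is older than 2.x"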
00:06:46.569 16:30:43 event -- common/autotest_common.sh@10 -- # set +x 00:06:46.569 ************************************ 00:06:46.569 START TEST event_perf 00:06:46.569 ************************************ 00:06:46.569 16:30:44 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:46.569 Running I/O for 1 seconds...[2024-11-28 16:30:44.021835] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:46.569 [2024-11-28 16:30:44.021938] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3745295 ] 00:06:46.569 [2024-11-28 16:30:44.093964] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:46.569 [2024-11-28 16:30:44.135423] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:46.569 [2024-11-28 16:30:44.135519] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:06:46.569 [2024-11-28 16:30:44.135580] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:06:46.569 [2024-11-28 16:30:44.135582] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.948 Running I/O for 1 seconds... 00:06:47.948 lcore 0: 197269 00:06:47.948 lcore 1: 197267 00:06:47.948 lcore 2: 197267 00:06:47.948 lcore 3: 197267 00:06:47.948 done. 00:06:47.948 00:06:47.948 real 0m1.187s 00:06:47.948 user 0m4.088s 00:06:47.948 sys 0m0.095s 00:06:47.948 16:30:45 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:47.948 16:30:45 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:47.948 ************************************ 00:06:47.948 END TEST event_perf 00:06:47.948 ************************************ 00:06:47.948 16:30:45 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:47.948 16:30:45 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:06:47.948 16:30:45 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:47.948 16:30:45 event -- common/autotest_common.sh@10 -- # set +x 00:06:47.948 ************************************ 00:06:47.948 START TEST event_reactor 00:06:47.948 ************************************ 00:06:47.948 16:30:45 event.event_reactor -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:47.948 [2024-11-28 16:30:45.277308] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
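The suite now moves from throughput to timing: event_reactor runs a single reactor (the EAL line below shows -c 0x1, one core only, in contrast to event_perf's four-core -m 0xF mask above) and schedules a handful of timed events over a one-second window. The tick lines that follow appear to be those timers firing at their registered periods, so the 100-period timer fires most often and the 500-period timer only once. Invocation, as run_test issued it above (path relative to the spdk tree):

    # one reactor, one-second timer test
    test/event/reactor/reactor -t 1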
00:06:47.948 [2024-11-28 16:30:45.277361] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3745528 ] 00:06:47.948 [2024-11-28 16:30:45.340945] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.948 [2024-11-28 16:30:45.377595] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.885 test_start 00:06:48.885 oneshot 00:06:48.885 tick 100 00:06:48.885 tick 100 00:06:48.885 tick 250 00:06:48.885 tick 100 00:06:48.885 tick 100 00:06:48.885 tick 100 00:06:48.885 tick 250 00:06:48.885 tick 500 00:06:48.885 tick 100 00:06:48.885 tick 100 00:06:48.885 tick 250 00:06:48.885 tick 100 00:06:48.885 tick 100 00:06:48.885 test_end 00:06:48.885 00:06:48.885 real 0m1.161s 00:06:48.885 user 0m1.082s 00:06:48.885 sys 0m0.074s 00:06:48.885 16:30:46 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:48.885 16:30:46 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:48.885 ************************************ 00:06:48.885 END TEST event_reactor 00:06:48.885 ************************************ 00:06:48.885 16:30:46 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:48.885 16:30:46 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:06:48.885 16:30:46 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:48.885 16:30:46 event -- common/autotest_common.sh@10 -- # set +x 00:06:48.885 ************************************ 00:06:48.885 START TEST event_reactor_perf 00:06:48.885 ************************************ 00:06:48.885 16:30:46 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:48.885 [2024-11-28 16:30:46.519613] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
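event_reactor_perf, starting here, looks like the single-core counterpart of event_perf: it drives one reactor with back-to-back events for the requested duration and prints a single Performance line (944832 events per second on this run, just below). Invocation, as in the run_test line above:

    # measure event round-trip rate on one core for one second
    test/event/reactor_perf/reactor_perf -t 1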
00:06:48.885 [2024-11-28 16:30:46.519718] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3745808 ] 00:06:49.144 [2024-11-28 16:30:46.589148] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.144 [2024-11-28 16:30:46.626125] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.090 test_start 00:06:50.090 test_end 00:06:50.090 Performance: 944832 events per second 00:06:50.090 00:06:50.090 real 0m1.176s 00:06:50.090 user 0m1.086s 00:06:50.090 sys 0m0.086s 00:06:50.090 16:30:47 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:50.090 16:30:47 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:50.090 ************************************ 00:06:50.090 END TEST event_reactor_perf 00:06:50.090 ************************************ 00:06:50.090 16:30:47 event -- event/event.sh@49 -- # uname -s 00:06:50.090 16:30:47 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:50.090 16:30:47 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:50.090 16:30:47 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:50.090 16:30:47 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:50.090 16:30:47 event -- common/autotest_common.sh@10 -- # set +x 00:06:50.350 ************************************ 00:06:50.350 START TEST event_scheduler 00:06:50.350 ************************************ 00:06:50.350 16:30:47 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:50.350 * Looking for test storage... 
00:06:50.350 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:50.350 16:30:47 event.event_scheduler -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:06:50.350 16:30:47 event.event_scheduler -- common/autotest_common.sh@1681 -- # lcov --version 00:06:50.350 16:30:47 event.event_scheduler -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:06:50.350 16:30:47 event.event_scheduler -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:50.350 16:30:47 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:06:50.350 16:30:47 event.event_scheduler -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:50.350 16:30:47 event.event_scheduler -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:06:50.350 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.350 --rc genhtml_branch_coverage=1 00:06:50.350 --rc genhtml_function_coverage=1 00:06:50.350 --rc genhtml_legend=1 00:06:50.350 --rc geninfo_all_blocks=1 00:06:50.350 --rc geninfo_unexecuted_blocks=1 00:06:50.350 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.350 ' 00:06:50.350 16:30:47 event.event_scheduler -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:06:50.350 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.350 --rc genhtml_branch_coverage=1 00:06:50.350 --rc genhtml_function_coverage=1 00:06:50.350 --rc genhtml_legend=1 00:06:50.351 --rc geninfo_all_blocks=1 00:06:50.351 --rc geninfo_unexecuted_blocks=1 00:06:50.351 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.351 ' 00:06:50.351 16:30:47 event.event_scheduler -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:06:50.351 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.351 --rc genhtml_branch_coverage=1 00:06:50.351 --rc genhtml_function_coverage=1 00:06:50.351 --rc genhtml_legend=1 00:06:50.351 --rc geninfo_all_blocks=1 00:06:50.351 --rc geninfo_unexecuted_blocks=1 00:06:50.351 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.351 ' 00:06:50.351 16:30:47 event.event_scheduler -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:06:50.351 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:50.351 --rc genhtml_branch_coverage=1 00:06:50.351 --rc genhtml_function_coverage=1 00:06:50.351 --rc genhtml_legend=1 00:06:50.351 --rc geninfo_all_blocks=1 00:06:50.351 --rc geninfo_unexecuted_blocks=1 00:06:50.351 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:50.351 ' 00:06:50.351 16:30:47 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:50.351 16:30:47 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=3746127 00:06:50.351 16:30:47 event.event_scheduler -- 
scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:50.351 16:30:47 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:50.351 16:30:47 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 3746127 00:06:50.351 16:30:47 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 3746127 ']' 00:06:50.351 16:30:47 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:50.351 16:30:47 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:50.351 16:30:47 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:50.351 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:50.351 16:30:47 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:50.351 16:30:47 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:50.351 [2024-11-28 16:30:47.978474] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:06:50.351 [2024-11-28 16:30:47.978559] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3746127 ] 00:06:50.611 [2024-11-28 16:30:48.042522] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:50.611 [2024-11-28 16:30:48.083544] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.611 [2024-11-28 16:30:48.083631] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:50.611 [2024-11-28 16:30:48.083688] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:06:50.611 [2024-11-28 16:30:48.083690] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:06:50.611 16:30:48 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:50.611 [2024-11-28 16:30:48.160416] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:50.611 [2024-11-28 16:30:48.160436] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:06:50.611 [2024-11-28 16:30:48.160449] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:50.611 [2024-11-28 16:30:48.160456] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:50.611 [2024-11-28 16:30:48.160464] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.611 16:30:48 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 
00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:50.611 [2024-11-28 16:30:48.232156] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.611 16:30:48 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:50.611 16:30:48 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:50.870 ************************************ 00:06:50.870 START TEST scheduler_create_thread 00:06:50.870 ************************************ 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.870 2 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.870 3 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.870 4 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.870 5 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.870 
16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.870 6 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.870 7 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.870 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.871 8 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.871 9 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:50.871 10 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:50.871 16:30:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:52.247 16:30:49 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.247 16:30:49 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:52.247 16:30:49 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:52.247 16:30:49 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.247 16:30:49 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:53.184 16:30:50 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:53.184 16:30:50 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:53.184 16:30:50 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:53.184 16:30:50 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.122 16:30:51 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:54.122 16:30:51 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:54.122 16:30:51 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:54.122 16:30:51 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:54.122 16:30:51 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.691 16:30:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:54.691 00:06:54.691 real 0m3.893s 00:06:54.691 user 0m0.027s 00:06:54.691 sys 0m0.005s 00:06:54.691 16:30:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:54.691 16:30:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:54.691 ************************************ 00:06:54.691 END TEST scheduler_create_thread 00:06:54.691 ************************************ 00:06:54.691 16:30:52 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:54.691 16:30:52 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 3746127 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 3746127 ']' 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 3746127 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3746127 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3746127' 00:06:54.691 killing process with pid 3746127 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 3746127 00:06:54.691 16:30:52 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 3746127 00:06:54.951 [2024-11-28 16:30:52.547857] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
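The scheduler_create_thread test above drives the test app purely over JSON-RPC through scheduler_plugin: create pinned and unpinned threads, raise one thread's activity by id, delete another, then tear the app down. A minimal sketch of that flow, assuming the repo path, socket, and plugin name from the trace and that scheduler_plugin is already importable (as scheduler.sh arranges); error handling is omitted:

#!/usr/bin/env bash
# Sketch of the RPC flow traced above; socket path and plugin name are
# taken from the log, the PYTHONPATH setup for the plugin is assumed.
SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
rpc() { "$SPDK/scripts/rpc.py" -s /var/tmp/spdk.sock --plugin scheduler_plugin "$@"; }

# One fully active pinned thread per core in the 0xF mask.
for mask in 0x1 0x2 0x4 0x8; do
    rpc scheduler_thread_create -n active_pinned -m "$mask" -a 100
done

# An unpinned thread created idle, then raised to 50% activity by id
# (the trace captures the returned id the same way: thread_id=11).
thread_id=$(rpc scheduler_thread_create -n half_active -a 0)
rpc scheduler_thread_set_active "$thread_id" 50

# A throwaway thread, deleted by id as in the trace.
thread_id=$(rpc scheduler_thread_create -n deleted -a 100)
rpc scheduler_thread_delete "$thread_id"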
00:06:55.210 00:06:55.210 real 0m5.043s 00:06:55.210 user 0m9.578s 00:06:55.210 sys 0m0.436s 00:06:55.210 16:30:52 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:55.210 16:30:52 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:55.210 ************************************ 00:06:55.210 END TEST event_scheduler 00:06:55.210 ************************************ 00:06:55.210 16:30:52 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:55.210 16:30:52 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:55.210 16:30:52 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:55.210 16:30:52 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:55.210 16:30:52 event -- common/autotest_common.sh@10 -- # set +x 00:06:55.469 ************************************ 00:06:55.469 START TEST app_repeat 00:06:55.469 ************************************ 00:06:55.469 16:30:52 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@19 -- # repeat_pid=3746984 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3746984' 00:06:55.469 Process app_repeat pid: 3746984 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:55.469 16:30:52 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:55.469 spdk_app_start Round 0 00:06:55.470 16:30:52 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3746984 /var/tmp/spdk-nbd.sock 00:06:55.470 16:30:52 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 3746984 ']' 00:06:55.470 16:30:52 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:55.470 16:30:52 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:55.470 16:30:52 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:55.470 16:30:52 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:55.470 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:55.470 16:30:52 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:55.470 16:30:52 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:55.470 [2024-11-28 16:30:52.907688] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:06:55.470 [2024-11-28 16:30:52.907787] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3746984 ] 00:06:55.470 [2024-11-28 16:30:52.976782] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:55.470 [2024-11-28 16:30:53.017239] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:55.470 [2024-11-28 16:30:53.017242] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.470 16:30:53 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:55.470 16:30:53 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:06:55.470 16:30:53 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:55.732 Malloc0 00:06:55.732 16:30:53 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:55.992 Malloc1 00:06:55.992 16:30:53 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:55.992 16:30:53 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:56.251 /dev/nbd0 00:06:56.251 16:30:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:56.251 16:30:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 
/proc/partitions 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:56.251 1+0 records in 00:06:56.251 1+0 records out 00:06:56.251 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256616 s, 16.0 MB/s 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:56.251 16:30:53 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:56.251 16:30:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:56.251 16:30:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:56.251 16:30:53 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:56.510 /dev/nbd1 00:06:56.510 16:30:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:56.510 16:30:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:56.510 16:30:53 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:06:56.510 16:30:53 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:06:56.510 16:30:53 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:06:56.510 16:30:53 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:06:56.510 16:30:53 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:06:56.510 16:30:53 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:06:56.510 16:30:53 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:06:56.510 16:30:53 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:06:56.510 16:30:53 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:56.510 1+0 records in 00:06:56.510 1+0 records out 00:06:56.510 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251283 s, 16.3 MB/s 00:06:56.510 16:30:54 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:56.510 16:30:54 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:06:56.510 16:30:54 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:56.510 16:30:54 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:06:56.510 16:30:54 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:06:56.510 16:30:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:56.510 16:30:54 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 
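Every nbd_start_disk above is gated by the waitfornbd helper whose xtrace dominates this stretch of the log: poll /proc/partitions until the kernel exposes the device, then prove it serves I/O with one direct-mode 4 KiB read. A condensed sketch, keeping the 20-attempt budget from the trace (the sleep interval and temp-file path are assumptions):

# Condensed waitfornbd, mirroring the grep/dd/stat sequence traced above.
waitfornbd() {
    local nbd_name=$1 i
    for ((i = 1; i <= 20; i++)); do                 # retry budget from the trace
        grep -q -w "$nbd_name" /proc/partitions && break
        sleep 0.1                                   # interval is an assumption
    done
    # One 4096-byte O_DIRECT read proves the device answers block I/O,
    # matching the dd/stat pair in the trace.
    dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct &&
        [ "$(stat -c %s /tmp/nbdtest)" != 0 ]
    local rc=$?
    rm -f /tmp/nbdtest
    return "$rc"
}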
00:06:56.510 16:30:54 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:56.510 16:30:54 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:56.510 16:30:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:56.770 { 00:06:56.770 "nbd_device": "/dev/nbd0", 00:06:56.770 "bdev_name": "Malloc0" 00:06:56.770 }, 00:06:56.770 { 00:06:56.770 "nbd_device": "/dev/nbd1", 00:06:56.770 "bdev_name": "Malloc1" 00:06:56.770 } 00:06:56.770 ]' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:56.770 { 00:06:56.770 "nbd_device": "/dev/nbd0", 00:06:56.770 "bdev_name": "Malloc0" 00:06:56.770 }, 00:06:56.770 { 00:06:56.770 "nbd_device": "/dev/nbd1", 00:06:56.770 "bdev_name": "Malloc1" 00:06:56.770 } 00:06:56.770 ]' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:56.770 /dev/nbd1' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:56.770 /dev/nbd1' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:56.770 256+0 records in 00:06:56.770 256+0 records out 00:06:56.770 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0111018 s, 94.5 MB/s 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:56.770 256+0 records in 00:06:56.770 256+0 records out 00:06:56.770 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209583 s, 50.0 MB/s 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:56.770 256+0 records in 00:06:56.770 256+0 records out 00:06:56.770 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0213295 s, 49.2 
MB/s 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:56.770 16:30:54 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:57.030 16:30:54 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.289 16:30:54 
event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:57.289 16:30:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:57.549 16:30:54 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:57.549 16:30:54 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:57.808 16:30:55 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:57.808 [2024-11-28 16:30:55.367346] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:57.808 [2024-11-28 16:30:55.402004] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:06:57.808 [2024-11-28 16:30:55.402006] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.808 [2024-11-28 16:30:55.443377] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:57.808 [2024-11-28 16:30:55.443435] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:01.098 16:30:58 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:01.098 16:30:58 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:01.098 spdk_app_start Round 1 00:07:01.098 16:30:58 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3746984 /var/tmp/spdk-nbd.sock 00:07:01.098 16:30:58 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 3746984 ']' 00:07:01.098 16:30:58 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:01.098 16:30:58 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:01.098 16:30:58 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:01.098 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
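Between rounds, app_repeat kills the instance with spdk_kill_instance SIGTERM, sleeps, and then blocks in waitforlisten until the respawned app answers on /var/tmp/spdk-nbd.sock; the max_retries=100 local in the trace is that loop's budget. A sketch of the gate, using rpc_get_methods as the liveness probe (the probe method and the 0.5 s interval are assumptions; the retry count and echo text come from the log):

# Sketch of waitforlisten: wait for a pid to answer RPCs on a UNIX socket.
SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
waitforlisten() {
    local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} max_retries=100
    echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
    while ((max_retries-- > 0)); do
        kill -0 "$pid" 2> /dev/null || return 1     # target died: stop waiting
        if "$SPDK/scripts/rpc.py" -t 1 -s "$rpc_addr" rpc_get_methods &> /dev/null; then
            return 0                                # socket up and serving RPCs
        fi
        sleep 0.5                                   # interval is an assumption
    done
    return 1
}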
00:07:01.098 16:30:58 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:01.098 16:30:58 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:01.098 16:30:58 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:01.098 16:30:58 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:01.098 16:30:58 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:01.098 Malloc0 00:07:01.098 16:30:58 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:01.357 Malloc1 00:07:01.357 16:30:58 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:01.357 16:30:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:01.358 16:30:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:01.358 /dev/nbd0 00:07:01.358 16:30:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:01.358 16:30:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:01.358 1+0 records in 00:07:01.358 1+0 records out 00:07:01.358 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235607 s, 17.4 MB/s 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:01.358 16:30:58 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:01.358 16:30:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:01.358 16:30:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:01.358 16:30:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:01.617 /dev/nbd1 00:07:01.617 16:30:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:01.617 16:30:59 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:01.617 1+0 records in 00:07:01.617 1+0 records out 00:07:01.617 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240616 s, 17.0 MB/s 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:01.617 16:30:59 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:01.617 16:30:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:01.617 16:30:59 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:01.617 16:30:59 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:01.617 16:30:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:01.617 16:30:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:01.876 16:30:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:01.876 { 00:07:01.876 "nbd_device": "/dev/nbd0", 00:07:01.876 "bdev_name": "Malloc0" 00:07:01.876 }, 00:07:01.876 { 00:07:01.876 "nbd_device": "/dev/nbd1", 00:07:01.876 "bdev_name": "Malloc1" 00:07:01.876 } 00:07:01.876 ]' 00:07:01.876 16:30:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:01.877 { 00:07:01.877 "nbd_device": "/dev/nbd0", 00:07:01.877 "bdev_name": "Malloc0" 00:07:01.877 }, 00:07:01.877 { 00:07:01.877 "nbd_device": "/dev/nbd1", 00:07:01.877 "bdev_name": "Malloc1" 00:07:01.877 } 00:07:01.877 ]' 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:01.877 /dev/nbd1' 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:01.877 /dev/nbd1' 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:01.877 256+0 records in 00:07:01.877 256+0 records out 00:07:01.877 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110233 s, 95.1 MB/s 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:01.877 256+0 records in 00:07:01.877 256+0 records out 00:07:01.877 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195888 s, 53.5 MB/s 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:01.877 16:30:59 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:02.136 256+0 records in 00:07:02.136 256+0 records out 00:07:02.136 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.02102 s, 49.9 MB/s 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:02.136 16:30:59 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:02.137 16:30:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:02.395 16:30:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.396 16:30:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:02.655 16:31:00 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:02.655 16:31:00 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:02.914 16:31:00 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:03.178 [2024-11-28 16:31:00.582240] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:03.178 [2024-11-28 16:31:00.617124] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:03.178 [2024-11-28 16:31:00.617126] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.178 [2024-11-28 16:31:00.659319] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:03.178 [2024-11-28 16:31:00.659365] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:06.463 16:31:03 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:06.463 16:31:03 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:06.463 spdk_app_start Round 2 00:07:06.463 16:31:03 event.app_repeat -- event/event.sh@25 -- # waitforlisten 3746984 /var/tmp/spdk-nbd.sock 00:07:06.463 16:31:03 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 3746984 ']' 00:07:06.463 16:31:03 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:06.463 16:31:03 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:06.463 16:31:03 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:06.463 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
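The dd and cmp traces in each round are the nbd_rpc_data_verify payload check: jq counts the exported devices from nbd_get_disks, then a 1 MiB random file is written through every /dev/nbdX in direct 4 KiB blocks and compared back byte for byte. A condensed sketch of that write/verify pass (the device list and temp path are shortened for illustration):

# Condensed data-verify pass over the exported nbd devices, matching
# the jq/dd/cmp sequence traced above (tmp path is an illustration).
SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
rpc_nbd() { "$SPDK/scripts/rpc.py" -s /var/tmp/spdk-nbd.sock "$@"; }

# Count exported devices the same way the nbd_get_count helper does.
count=$(rpc_nbd nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd)
[ "$count" -eq 2 ]

tmp=/tmp/nbdrandtest
dd if=/dev/urandom of="$tmp" bs=4096 count=256       # 1 MiB of random data
for nbd in /dev/nbd0 /dev/nbd1; do
    dd if="$tmp" of="$nbd" bs=4096 count=256 oflag=direct
done
for nbd in /dev/nbd0 /dev/nbd1; do
    cmp -b -n 1M "$tmp" "$nbd"                       # byte-for-byte readback
done
rm "$tmp"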
00:07:06.463 16:31:03 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:06.463 16:31:03 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:06.463 16:31:03 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:06.463 16:31:03 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:06.463 16:31:03 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:06.463 Malloc0 00:07:06.463 16:31:03 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:06.463 Malloc1 00:07:06.463 16:31:03 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:06.463 16:31:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:06.722 /dev/nbd0 00:07:06.722 16:31:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:06.722 16:31:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:06.722 1+0 records in 00:07:06.722 1+0 records out 00:07:06.722 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265522 s, 15.4 MB/s 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:06.722 16:31:04 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:06.722 16:31:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:06.722 16:31:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:06.722 16:31:04 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:06.981 /dev/nbd1 00:07:06.981 16:31:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:06.981 16:31:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:06.981 1+0 records in 00:07:06.981 1+0 records out 00:07:06.981 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227873 s, 18.0 MB/s 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:06.981 16:31:04 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:06.981 16:31:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:06.981 16:31:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:06.981 16:31:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:06.981 16:31:04 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.981 16:31:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:07.241 { 00:07:07.241 "nbd_device": "/dev/nbd0", 00:07:07.241 "bdev_name": "Malloc0" 00:07:07.241 }, 00:07:07.241 { 00:07:07.241 "nbd_device": "/dev/nbd1", 00:07:07.241 "bdev_name": "Malloc1" 00:07:07.241 } 00:07:07.241 ]' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:07.241 { 00:07:07.241 "nbd_device": "/dev/nbd0", 00:07:07.241 "bdev_name": "Malloc0" 00:07:07.241 }, 00:07:07.241 { 00:07:07.241 "nbd_device": "/dev/nbd1", 00:07:07.241 "bdev_name": "Malloc1" 00:07:07.241 } 00:07:07.241 ]' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:07.241 /dev/nbd1' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:07.241 /dev/nbd1' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:07.241 256+0 records in 00:07:07.241 256+0 records out 00:07:07.241 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110483 s, 94.9 MB/s 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:07.241 256+0 records in 00:07:07.241 256+0 records out 00:07:07.241 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0199327 s, 52.6 MB/s 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:07.241 256+0 records in 00:07:07.241 256+0 records out 00:07:07.241 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0213449 s, 49.1 MB/s 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # 
local operation=verify 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.241 16:31:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:07.242 16:31:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:07.242 16:31:04 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:07.242 16:31:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:07.242 16:31:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:07.500 16:31:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:07.500 16:31:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:07.500 16:31:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:07.500 16:31:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:07.500 16:31:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:07.500 16:31:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:07.500 16:31:05 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:07.500 16:31:05 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:07.500 16:31:05 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:07.500 16:31:05 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
nbd_get_count /var/tmp/spdk-nbd.sock 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:07.759 16:31:05 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:08.018 16:31:05 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:08.018 16:31:05 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:08.018 16:31:05 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:08.277 [2024-11-28 16:31:05.823231] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:08.277 [2024-11-28 16:31:05.857948] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:08.277 [2024-11-28 16:31:05.857951] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.277 [2024-11-28 16:31:05.899324] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:08.277 [2024-11-28 16:31:05.899371] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:07:11.566 16:31:08 event.app_repeat -- event/event.sh@38 -- # waitforlisten 3746984 /var/tmp/spdk-nbd.sock 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 3746984 ']' 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:11.566 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
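
Editorial note: the count checks traced above parse the nbd_get_disks JSON with jq and count the /dev/nbd names, which is why a bare true follows the empty-list case (grep -c exits non-zero when it matches nothing). A minimal reconstruction of that helper from the bdev/nbd_common.sh trace — the rpc.py path is abbreviated here, and the || true guard is inferred from the trace rather than quoted from source:

  # Count the nbd devices currently exported by the app at $rpc_server.
  nbd_get_count() {
      local rpc_server=$1
      local nbd_disks_json nbd_disks_name
      # JSON array of {nbd_device, bdev_name} objects, possibly empty ([]).
      nbd_disks_json=$(scripts/rpc.py -s "$rpc_server" nbd_get_disks)
      # Reduce to one device path per line.
      nbd_disks_name=$(jq -r '.[] | .nbd_device' <<< "$nbd_disks_json")
      # grep -c prints 0 but exits 1 when nothing matches; tolerate that.
      grep -c /dev/nbd <<< "$nbd_disks_name" || true
  }

In the trace this yields count=2 while both disks are attached and count=0 after both nbd_stop_disk calls, which is what lets the test assert a clean teardown.
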
00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:11.566 16:31:08 event.app_repeat -- event/event.sh@39 -- # killprocess 3746984 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 3746984 ']' 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 3746984 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3746984 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3746984' 00:07:11.566 killing process with pid 3746984 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@969 -- # kill 3746984 00:07:11.566 16:31:08 event.app_repeat -- common/autotest_common.sh@974 -- # wait 3746984 00:07:11.566 spdk_app_start is called in Round 0. 00:07:11.566 Shutdown signal received, stop current app iteration 00:07:11.566 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 reinitialization... 00:07:11.566 spdk_app_start is called in Round 1. 00:07:11.566 Shutdown signal received, stop current app iteration 00:07:11.566 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 reinitialization... 00:07:11.566 spdk_app_start is called in Round 2. 00:07:11.566 Shutdown signal received, stop current app iteration 00:07:11.566 Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 reinitialization... 00:07:11.566 spdk_app_start is called in Round 3. 
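
The kill-and-reap sequence just traced (kill -0 probe, comm lookup, sudo guard, kill, wait) recurs throughout this run. Sketched from the xtrace as follows — treat it as an approximation of the autotest_common.sh helper, not a verbatim copy:

  killprocess() {
      local pid=$1 process_name
      [ -n "$pid" ] || return 1          # the '[' -z ... ']' check in the trace
      kill -0 "$pid" || return 1         # is the process still alive?
      if [ "$(uname)" = Linux ]; then
          # Resolve the command name, e.g. reactor_0 for an SPDK app.
          process_name=$(ps --no-headers -o comm= "$pid")
      fi
      [ "$process_name" = sudo ] && return 1   # never kill a sudo wrapper
      echo "killing process with pid $pid"
      kill "$pid"
      wait "$pid"                        # reap it so shutdown cannot race
  }
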
00:07:11.566 Shutdown signal received, stop current app iteration 00:07:11.566 16:31:09 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:11.566 16:31:09 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:11.566 00:07:11.566 real 0m16.179s 00:07:11.566 user 0m34.748s 00:07:11.566 sys 0m3.161s 00:07:11.566 16:31:09 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:11.566 16:31:09 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:11.566 ************************************ 00:07:11.566 END TEST app_repeat 00:07:11.566 ************************************ 00:07:11.566 16:31:09 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:11.566 16:31:09 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:11.566 16:31:09 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:11.566 16:31:09 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:11.566 16:31:09 event -- common/autotest_common.sh@10 -- # set +x 00:07:11.566 ************************************ 00:07:11.566 START TEST cpu_locks 00:07:11.566 ************************************ 00:07:11.566 16:31:09 event.cpu_locks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:11.825 * Looking for test storage... 00:07:11.825 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:11.825 16:31:09 event.cpu_locks -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:11.825 16:31:09 event.cpu_locks -- common/autotest_common.sh@1681 -- # lcov --version 00:07:11.825 16:31:09 event.cpu_locks -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:11.825 16:31:09 event.cpu_locks -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:11.825 16:31:09 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:11.825 16:31:09 event.cpu_locks -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:11.825 16:31:09 event.cpu_locks -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:11.825 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.825 --rc genhtml_branch_coverage=1 00:07:11.825 --rc genhtml_function_coverage=1 00:07:11.825 --rc genhtml_legend=1 00:07:11.825 --rc geninfo_all_blocks=1 00:07:11.825 --rc geninfo_unexecuted_blocks=1 00:07:11.825 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.825 ' 00:07:11.825 16:31:09 event.cpu_locks -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:11.825 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.825 --rc genhtml_branch_coverage=1 00:07:11.825 --rc genhtml_function_coverage=1 00:07:11.825 --rc genhtml_legend=1 00:07:11.825 --rc geninfo_all_blocks=1 00:07:11.825 --rc geninfo_unexecuted_blocks=1 00:07:11.825 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.825 ' 00:07:11.825 16:31:09 event.cpu_locks -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:11.825 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.825 --rc genhtml_branch_coverage=1 00:07:11.825 --rc genhtml_function_coverage=1 00:07:11.825 --rc genhtml_legend=1 00:07:11.825 --rc geninfo_all_blocks=1 00:07:11.825 --rc geninfo_unexecuted_blocks=1 00:07:11.825 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.825 ' 00:07:11.825 16:31:09 event.cpu_locks -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:11.825 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:11.826 --rc genhtml_branch_coverage=1 00:07:11.826 --rc genhtml_function_coverage=1 00:07:11.826 --rc genhtml_legend=1 00:07:11.826 --rc geninfo_all_blocks=1 00:07:11.826 --rc geninfo_unexecuted_blocks=1 00:07:11.826 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:11.826 ' 00:07:11.826 16:31:09 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:11.826 16:31:09 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:11.826 16:31:09 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:11.826 16:31:09 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:11.826 16:31:09 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:11.826 16:31:09 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:11.826 16:31:09 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:11.826 ************************************ 00:07:11.826 START TEST default_locks 00:07:11.826 ************************************ 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- common/autotest_common.sh@1125 -- # default_locks 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=3750156 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 3750156 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 3750156 ']' 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:11.826 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:11.826 16:31:09 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:11.826 [2024-11-28 16:31:09.377553] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
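
Backing up to the scripts/common.sh probe traced above: the lcov version string is split on '.', '-' and ':' and compared component-wise. A condensed sketch of that comparator, reconstructed from the xtrace (the upstream function also normalizes non-numeric components via decimal(), omitted here for brevity):

  # lt A B: succeed when version A sorts strictly before version B.
  lt() { cmp_versions "$1" '<' "$2"; }

  cmp_versions() {
      local -a ver1 ver2
      local op=$2 v len
      IFS='.-:' read -ra ver1 <<< "$1"
      IFS='.-:' read -ra ver2 <<< "$3"
      len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
      for (( v = 0; v < len; v++ )); do
          # Missing components (e.g. "2" vs "1.15") compare as 0.
          if (( ${ver1[v]:-0} > ${ver2[v]:-0} )); then
              [[ $op == '>' || $op == '>=' ]]; return
          elif (( ${ver1[v]:-0} < ${ver2[v]:-0} )); then
              [[ $op == '<' || $op == '<=' ]]; return
          fi
      done
      [[ $op == *'='* ]]   # all components equal
  }

In the trace, lt 1.15 2 succeeding is what selects the --rc lcov_branch_coverage=1 option set shown above.
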
00:07:11.826 [2024-11-28 16:31:09.377617] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3750156 ] 00:07:11.826 [2024-11-28 16:31:09.442864] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.085 [2024-11-28 16:31:09.482786] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.085 16:31:09 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:12.085 16:31:09 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 0 00:07:12.085 16:31:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 3750156 00:07:12.085 16:31:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 3750156 00:07:12.085 16:31:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:12.653 lslocks: write error 00:07:12.653 16:31:10 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 3750156 00:07:12.653 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@950 -- # '[' -z 3750156 ']' 00:07:12.653 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # kill -0 3750156 00:07:12.654 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # uname 00:07:12.654 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:12.654 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3750156 00:07:12.654 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:12.654 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:12.654 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3750156' 00:07:12.654 killing process with pid 3750156 00:07:12.654 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@969 -- # kill 3750156 00:07:12.654 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@974 -- # wait 3750156 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 3750156 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@650 -- # local es=0 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3750156 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # waitforlisten 3750156 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@831 -- # '[' -z 3750156 ']' 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@836 -- # local 
max_retries=100 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:12.913 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:12.913 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (3750156) - No such process 00:07:12.913 ERROR: process (pid: 3750156) is no longer running 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # return 1 00:07:12.913 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@653 -- # es=1 00:07:12.914 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:12.914 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:12.914 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:12.914 16:31:10 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:12.914 16:31:10 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:12.914 16:31:10 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:12.914 16:31:10 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:12.914 00:07:12.914 real 0m1.120s 00:07:12.914 user 0m1.098s 00:07:12.914 sys 0m0.551s 00:07:12.914 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:12.914 16:31:10 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:12.914 ************************************ 00:07:12.914 END TEST default_locks 00:07:12.914 ************************************ 00:07:12.914 16:31:10 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:12.914 16:31:10 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:12.914 16:31:10 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:12.914 16:31:10 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:12.914 ************************************ 00:07:12.914 START TEST default_locks_via_rpc 00:07:12.914 ************************************ 00:07:12.914 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1125 -- # default_locks_via_rpc 00:07:12.914 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=3750347 00:07:13.174 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 3750347 00:07:13.174 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:13.174 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3750347 ']' 00:07:13.174 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:13.174 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 
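
A note on the 'lslocks: write error' line above: it is benign — lslocks complains when grep -q closes the pipe after the first match, and only grep's verdict matters to the test. The probe itself is tiny; reconstructed from the trace:

  # Does $1 (a pid) hold an SPDK per-core CPU lock? spdk_tgt backs each
  # claimed core with a file lock whose name contains "spdk_cpu_lock".
  locks_exist() {
      lslocks -p "$1" | grep -q spdk_cpu_lock
  }

default_locks above starts spdk_tgt -m 0x1, asserts locks_exist for its pid, SIGKILLs it, and then uses no_locks to confirm the lock files were cleaned up.
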
00:07:13.174 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:13.174 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:13.174 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:13.174 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:13.174 [2024-11-28 16:31:10.584287] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:13.174 [2024-11-28 16:31:10.584365] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3750347 ] 00:07:13.174 [2024-11-28 16:31:10.652432] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.174 [2024-11-28 16:31:10.692029] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 3750347 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:13.473 16:31:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 3750347 00:07:13.798 16:31:11 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 3750347 00:07:13.799 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@950 -- # '[' -z 3750347 ']' 00:07:13.799 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # kill -0 3750347 00:07:13.799 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 -- # uname 00:07:13.799 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@955 
-- # '[' Linux = Linux ']' 00:07:13.799 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3750347 00:07:14.058 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:14.058 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:14.058 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3750347' 00:07:14.058 killing process with pid 3750347 00:07:14.058 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@969 -- # kill 3750347 00:07:14.058 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@974 -- # wait 3750347 00:07:14.317 00:07:14.317 real 0m1.195s 00:07:14.317 user 0m1.160s 00:07:14.317 sys 0m0.591s 00:07:14.317 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:14.317 16:31:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:14.317 ************************************ 00:07:14.317 END TEST default_locks_via_rpc 00:07:14.317 ************************************ 00:07:14.317 16:31:11 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:14.317 16:31:11 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:14.317 16:31:11 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:14.317 16:31:11 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:14.317 ************************************ 00:07:14.317 START TEST non_locking_app_on_locked_coremask 00:07:14.317 ************************************ 00:07:14.317 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # non_locking_app_on_locked_coremask 00:07:14.317 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=3750512 00:07:14.317 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 3750512 /var/tmp/spdk.sock 00:07:14.317 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:14.317 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3750512 ']' 00:07:14.317 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:14.318 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:14.318 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:14.318 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:14.318 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:14.318 16:31:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:14.318 [2024-11-28 16:31:11.857221] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
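
default_locks_via_rpc, which just finished above, drives the same lock lifecycle over JSON-RPC instead of process flags. In outline — rpc_cmd is the suite's wrapper around scripts/rpc.py, and the assertions are paraphrased from the trace:

  # Release the per-core locks over RPC, verify, then re-arm them.
  rpc_cmd framework_disable_cpumask_locks   # drop any per-core lock files
  no_locks                                  # assert zero spdk_cpu_lock files
  rpc_cmd framework_enable_cpumask_locks    # claim the locks back
  locks_exist "$spdk_tgt_pid"               # lslocks must now show them
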
00:07:14.318 [2024-11-28 16:31:11.857294] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3750512 ] 00:07:14.318 [2024-11-28 16:31:11.925450] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.577 [2024-11-28 16:31:11.965119] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=3750673 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 3750673 /var/tmp/spdk2.sock 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3750673 ']' 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:14.577 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:14.577 16:31:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:14.577 [2024-11-28 16:31:12.183456] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:14.577 [2024-11-28 16:31:12.183547] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3750673 ] 00:07:14.836 [2024-11-28 16:31:12.273277] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
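
The 'CPU core locks deactivated' notice marks the point of non_locking_app_on_locked_coremask: a second target may share core 0 only by opting out of the lock. Roughly, with the full jenkins binary paths shortened to spdk_tgt:

  spdk_tgt -m 0x1 &                  # first instance claims and locks core 0
  spdk_tgt_pid=$!
  waitforlisten "$spdk_tgt_pid"
  # Same core, but skip locking and listen on a second RPC socket.
  spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &
  spdk_tgt_pid2=$!
  waitforlisten "$spdk_tgt_pid2" /var/tmp/spdk2.sock
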
00:07:14.836 [2024-11-28 16:31:12.273307] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.836 [2024-11-28 16:31:12.353242] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.403 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:15.403 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:15.403 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 3750512 00:07:15.403 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 3750512 00:07:15.403 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:16.373 lslocks: write error 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 3750512 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3750512 ']' 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 3750512 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3750512 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3750512' 00:07:16.373 killing process with pid 3750512 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 3750512 00:07:16.373 16:31:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 3750512 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 3750673 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3750673 ']' 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 3750673 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3750673 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3750673' 00:07:16.942 
killing process with pid 3750673 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 3750673 00:07:16.942 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 3750673 00:07:17.202 00:07:17.202 real 0m2.912s 00:07:17.202 user 0m3.000s 00:07:17.202 sys 0m1.107s 00:07:17.202 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:17.202 16:31:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:17.202 ************************************ 00:07:17.202 END TEST non_locking_app_on_locked_coremask 00:07:17.202 ************************************ 00:07:17.202 16:31:14 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:17.202 16:31:14 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:17.202 16:31:14 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:17.202 16:31:14 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:17.202 ************************************ 00:07:17.202 START TEST locking_app_on_unlocked_coremask 00:07:17.202 ************************************ 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_unlocked_coremask 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=3751068 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 3751068 /var/tmp/spdk.sock 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3751068 ']' 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:17.202 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:17.202 16:31:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:17.461 [2024-11-28 16:31:14.851261] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:17.461 [2024-11-28 16:31:14.851340] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3751068 ] 00:07:17.461 [2024-11-28 16:31:14.920912] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:17.461 [2024-11-28 16:31:14.920939] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.461 [2024-11-28 16:31:14.958941] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=3751185 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 3751185 /var/tmp/spdk2.sock 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3751185 ']' 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:17.730 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:17.730 16:31:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:17.730 [2024-11-28 16:31:15.178660] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:17.730 [2024-11-28 16:31:15.178747] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3751185 ] 00:07:17.731 [2024-11-28 16:31:15.267337] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.731 [2024-11-28 16:31:15.344796] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.674 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:18.674 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:18.674 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 3751185 00:07:18.674 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 3751185 00:07:18.674 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:19.243 lslocks: write error 00:07:19.243 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 3751068 00:07:19.243 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3751068 ']' 00:07:19.502 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 3751068 00:07:19.503 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:19.503 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:19.503 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3751068 00:07:19.503 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:19.503 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:19.503 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3751068' 00:07:19.503 killing process with pid 3751068 00:07:19.503 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 3751068 00:07:19.503 16:31:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 3751068 00:07:20.073 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 3751185 00:07:20.073 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3751185 ']' 00:07:20.074 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # kill -0 3751185 00:07:20.074 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:20.074 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:20.074 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3751185 00:07:20.074 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:20.074 16:31:17 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:20.074 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3751185' 00:07:20.074 killing process with pid 3751185 00:07:20.074 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@969 -- # kill 3751185 00:07:20.074 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@974 -- # wait 3751185 00:07:20.334 00:07:20.334 real 0m3.096s 00:07:20.334 user 0m3.227s 00:07:20.334 sys 0m1.153s 00:07:20.334 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:20.334 16:31:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:20.334 ************************************ 00:07:20.334 END TEST locking_app_on_unlocked_coremask 00:07:20.334 ************************************ 00:07:20.334 16:31:17 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:20.334 16:31:17 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:20.334 16:31:17 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:20.334 16:31:17 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:20.594 ************************************ 00:07:20.594 START TEST locking_app_on_locked_coremask 00:07:20.594 ************************************ 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1125 -- # locking_app_on_locked_coremask 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=3751644 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 3751644 /var/tmp/spdk.sock 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3751644 ']' 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:20.594 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:20.594 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:20.594 [2024-11-28 16:31:18.030353] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:20.594 [2024-11-28 16:31:18.030436] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3751644 ] 00:07:20.594 [2024-11-28 16:31:18.098341] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.594 [2024-11-28 16:31:18.136015] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=3751799 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 3751799 /var/tmp/spdk2.sock 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3751799 /var/tmp/spdk2.sock 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # waitforlisten 3751799 /var/tmp/spdk2.sock 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@831 -- # '[' -z 3751799 ']' 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:20.854 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:20.854 16:31:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:20.854 [2024-11-28 16:31:18.357135] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:20.854 [2024-11-28 16:31:18.357223] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3751799 ] 00:07:20.854 [2024-11-28 16:31:18.443430] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 3751644 has claimed it. 00:07:20.854 [2024-11-28 16:31:18.443471] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:21.423 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (3751799) - No such process 00:07:21.423 ERROR: process (pid: 3751799) is no longer running 00:07:21.423 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:21.423 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:21.423 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:21.423 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:21.423 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:21.423 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:21.423 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 3751644 00:07:21.423 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 3751644 00:07:21.423 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:22.362 lslocks: write error 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 3751644 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@950 -- # '[' -z 3751644 ']' 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # kill -0 3751644 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # uname 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3751644 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3751644' 00:07:22.362 killing process with pid 3751644 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@969 -- # kill 3751644 00:07:22.362 16:31:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@974 -- # wait 3751644 00:07:22.621 00:07:22.621 real 0m2.122s 00:07:22.621 user 0m2.263s 00:07:22.621 sys 0m0.830s 00:07:22.621 16:31:20 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 
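
The expected-failure machinery above (es=1 after the claim_cpu_cores error, then the (( es > 128 )) and (( !es == 0 )) checks) is the NOT wrapper: it runs a command that must fail and inverts the verdict, without tripping set -e. Simplified sketch — the traced version also vets the argument with valid_exec_arg first:

  NOT() {
      local es=0
      "$@" || es=$?        # capture the failure instead of aborting
      if (( es > 128 )); then
          return "$es"     # died by signal: propagate, never invert
      fi
      (( es != 0 ))        # succeed only when the wrapped command failed
  }

So NOT waitforlisten 3751799 above passes precisely because the second target could not acquire the core-0 lock and exited.
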
00:07:22.621 16:31:20 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:22.621 ************************************ 00:07:22.621 END TEST locking_app_on_locked_coremask 00:07:22.621 ************************************ 00:07:22.621 16:31:20 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:22.621 16:31:20 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:22.621 16:31:20 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:22.621 16:31:20 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:22.621 ************************************ 00:07:22.621 START TEST locking_overlapped_coremask 00:07:22.621 ************************************ 00:07:22.621 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask 00:07:22.621 16:31:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=3752201 00:07:22.621 16:31:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 3752201 /var/tmp/spdk.sock 00:07:22.621 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 3752201 ']' 00:07:22.622 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:22.622 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:22.622 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:22.622 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:22.622 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:22.622 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:22.622 16:31:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:22.622 [2024-11-28 16:31:20.222097] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:22.622 [2024-11-28 16:31:20.222178] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3752201 ] 00:07:22.881 [2024-11-28 16:31:20.289418] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:22.881 [2024-11-28 16:31:20.329653] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:22.881 [2024-11-28 16:31:20.329749] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:22.881 [2024-11-28 16:31:20.329752] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.881 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:22.881 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 0 00:07:22.881 16:31:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=3752208 00:07:22.881 16:31:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 3752208 /var/tmp/spdk2.sock 00:07:22.881 16:31:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:22.881 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@650 -- # local es=0 00:07:22.882 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3752208 /var/tmp/spdk2.sock 00:07:22.882 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # waitforlisten 3752208 /var/tmp/spdk2.sock 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@831 -- # '[' -z 3752208 ']' 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:23.140 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:23.140 16:31:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:23.140 [2024-11-28 16:31:20.550179] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:23.140 [2024-11-28 16:31:20.550253] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3752208 ] 00:07:23.140 [2024-11-28 16:31:20.641241] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3752201 has claimed it. 00:07:23.140 [2024-11-28 16:31:20.641277] app.c: 910:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:23.708 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 846: kill: (3752208) - No such process 00:07:23.708 ERROR: process (pid: 3752208) is no longer running 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # return 1 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@653 -- # es=1 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 3752201 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@950 -- # '[' -z 3752201 ']' 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # kill -0 3752201 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # uname 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3752201 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3752201' 00:07:23.708 killing process with pid 3752201 00:07:23.708 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@969 -- # kill 3752201 00:07:23.708 16:31:21 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@974 -- # wait 3752201 00:07:23.967 00:07:23.967 real 0m1.389s 00:07:23.967 user 0m3.804s 00:07:23.967 sys 0m0.430s 00:07:23.967 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:23.967 16:31:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:23.967 ************************************ 00:07:23.967 END TEST locking_overlapped_coremask 00:07:23.967 ************************************ 00:07:24.227 16:31:21 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:24.227 16:31:21 event.cpu_locks -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:24.227 16:31:21 event.cpu_locks -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:24.227 16:31:21 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:24.227 ************************************ 00:07:24.227 START TEST locking_overlapped_coremask_via_rpc 00:07:24.227 ************************************ 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1125 -- # locking_overlapped_coremask_via_rpc 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=3752498 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 3752498 /var/tmp/spdk.sock 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3752498 ']' 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:24.227 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:24.227 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:24.227 [2024-11-28 16:31:21.691616] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:24.227 [2024-11-28 16:31:21.691677] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3752498 ] 00:07:24.227 [2024-11-28 16:31:21.758174] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
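The mask arithmetic behind every overlap failure in these tests, as a hedged aside in plain bash (not captured output): the two masks used here intersect in exactly one core.
# 0x7  = 0b00111 -> cores 0,1,2  (first target)
# 0x1c = 0b11100 -> cores 2,3,4  (second target)
printf 'contested core mask: 0x%x\n' $(( 0x7 & 0x1c ))   # prints 0x4, i.e. core 2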
00:07:24.227 [2024-11-28 16:31:21.758199] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:24.227 [2024-11-28 16:31:21.799860] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:24.227 [2024-11-28 16:31:21.799954] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:24.227 [2024-11-28 16:31:21.799956] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.487 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:24.487 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:24.487 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=3752503 00:07:24.487 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 3752503 /var/tmp/spdk2.sock 00:07:24.487 16:31:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:24.487 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3752503 ']' 00:07:24.487 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:24.487 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:24.487 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:24.487 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:24.487 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:24.487 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:24.487 [2024-11-28 16:31:22.024886] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:24.487 [2024-11-28 16:31:22.024976] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3752503 ] 00:07:24.487 [2024-11-28 16:31:22.114881] app.c: 914:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:24.487 [2024-11-28 16:31:22.114909] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:24.747 [2024-11-28 16:31:22.195407] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 3 00:07:24.747 [2024-11-28 16:31:22.198647] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 2 00:07:24.747 [2024-11-28 16:31:22.198648] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 4 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@650 -- # local es=0 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:25.317 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:25.317 [2024-11-28 16:31:22.893666] app.c: 780:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3752498 has claimed it. 
00:07:25.317 request: 00:07:25.318 { 00:07:25.318 "method": "framework_enable_cpumask_locks", 00:07:25.318 "req_id": 1 00:07:25.318 } 00:07:25.318 Got JSON-RPC error response 00:07:25.318 response: 00:07:25.318 { 00:07:25.318 "code": -32603, 00:07:25.318 "message": "Failed to claim CPU core: 2" 00:07:25.318 } 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@653 -- # es=1 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 3752498 /var/tmp/spdk.sock 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3752498 ']' 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.318 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:25.318 16:31:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:25.577 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:25.577 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:25.577 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 3752503 /var/tmp/spdk2.sock 00:07:25.577 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@831 -- # '[' -z 3752503 ']' 00:07:25.577 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:25.577 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:25.577 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:25.577 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
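Restating what check_remaining_locks (traced below) asserts, as a self-contained snippet; the glob and expected list mirror cpu_locks.sh, the echo is illustrative:
locks=(/var/tmp/spdk_cpu_lock_*)
locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})
[[ "${locks[*]}" == "${locks_expected[*]}" ]] && echo 'only cores 0-2 remain locked'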
00:07:25.577 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:25.577 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:25.837 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:25.837 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:25.837 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:25.837 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:25.837 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:25.837 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:25.837 00:07:25.837 real 0m1.659s 00:07:25.837 user 0m0.785s 00:07:25.837 sys 0m0.174s 00:07:25.837 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:25.837 16:31:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:25.837 ************************************ 00:07:25.837 END TEST locking_overlapped_coremask_via_rpc 00:07:25.837 ************************************ 00:07:25.837 16:31:23 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:25.837 16:31:23 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 3752498 ]] 00:07:25.837 16:31:23 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 3752498 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 3752498 ']' 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 3752498 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3752498 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3752498' 00:07:25.837 killing process with pid 3752498 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 3752498 00:07:25.837 16:31:23 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 3752498 00:07:26.407 16:31:23 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 3752503 ]] 00:07:26.407 16:31:23 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 3752503 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 3752503 ']' 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 3752503 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@955 -- # uname 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@955 -- # '[' 
Linux = Linux ']' 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3752503 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3752503' 00:07:26.407 killing process with pid 3752503 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@969 -- # kill 3752503 00:07:26.407 16:31:23 event.cpu_locks -- common/autotest_common.sh@974 -- # wait 3752503 00:07:26.667 16:31:24 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:26.667 16:31:24 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:26.667 16:31:24 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 3752498 ]] 00:07:26.667 16:31:24 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 3752498 00:07:26.667 16:31:24 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 3752498 ']' 00:07:26.667 16:31:24 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 3752498 00:07:26.667 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (3752498) - No such process 00:07:26.667 16:31:24 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 3752498 is not found' 00:07:26.667 Process with pid 3752498 is not found 00:07:26.667 16:31:24 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 3752503 ]] 00:07:26.667 16:31:24 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 3752503 00:07:26.667 16:31:24 event.cpu_locks -- common/autotest_common.sh@950 -- # '[' -z 3752503 ']' 00:07:26.667 16:31:24 event.cpu_locks -- common/autotest_common.sh@954 -- # kill -0 3752503 00:07:26.667 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (3752503) - No such process 00:07:26.667 16:31:24 event.cpu_locks -- common/autotest_common.sh@977 -- # echo 'Process with pid 3752503 is not found' 00:07:26.667 Process with pid 3752503 is not found 00:07:26.667 16:31:24 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:26.667 00:07:26.667 real 0m14.997s 00:07:26.667 user 0m25.193s 00:07:26.667 sys 0m5.927s 00:07:26.667 16:31:24 event.cpu_locks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:26.667 16:31:24 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:26.667 ************************************ 00:07:26.667 END TEST cpu_locks 00:07:26.667 ************************************ 00:07:26.667 00:07:26.667 real 0m40.416s 00:07:26.667 user 1m16.059s 00:07:26.667 sys 0m10.220s 00:07:26.667 16:31:24 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:26.667 16:31:24 event -- common/autotest_common.sh@10 -- # set +x 00:07:26.667 ************************************ 00:07:26.667 END TEST event 00:07:26.667 ************************************ 00:07:26.667 16:31:24 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:26.667 16:31:24 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:26.667 16:31:24 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:26.667 16:31:24 -- common/autotest_common.sh@10 -- # set +x 00:07:26.667 ************************************ 00:07:26.667 START TEST thread 00:07:26.667 ************************************ 00:07:26.667 16:31:24 thread -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:26.927 * Looking for test storage... 00:07:26.927 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:26.927 16:31:24 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:26.927 16:31:24 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:26.927 16:31:24 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:26.927 16:31:24 thread -- common/autotest_common.sh@10 -- # set +x 00:07:26.927 ************************************ 00:07:26.927 START TEST thread_poller_perf 00:07:26.927 ************************************
00:07:26.927 16:31:24 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:26.927 [2024-11-28 16:31:24.460736] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:26.927 [2024-11-28 16:31:24.460790] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3753001 ] 00:07:26.927 [2024-11-28 16:31:24.525207] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.927 [2024-11-28 16:31:24.563397] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.927 Running 1000 pollers for 1 seconds with 1 microseconds period.
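How the summary below follows from this run's counters, as a worked note (values from the log, tsc_hz = 2.5 GHz):
# poller_cost = busy_cycles / total_run_count = 2504185766 / 856000 ≈ 2925 cyc
# 2925 cyc / 2.5 cyc-per-nsec = 1170 nsec per poller invocation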
00:07:28.305 [2024-11-28T15:31:25.951Z] ====================================== 00:07:28.305 [2024-11-28T15:31:25.951Z] busy:2504185766 (cyc) 00:07:28.305 [2024-11-28T15:31:25.951Z] total_run_count: 856000 00:07:28.305 [2024-11-28T15:31:25.951Z] tsc_hz: 2500000000 (cyc) 00:07:28.305 [2024-11-28T15:31:25.951Z] ====================================== 00:07:28.305 [2024-11-28T15:31:25.951Z] poller_cost: 2925 (cyc), 1170 (nsec) 00:07:28.305 00:07:28.305 real 0m1.169s 00:07:28.305 user 0m1.081s 00:07:28.305 sys 0m0.085s 00:07:28.305 16:31:25 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:28.305 16:31:25 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:28.305 ************************************ 00:07:28.305 END TEST thread_poller_perf 00:07:28.305 ************************************ 00:07:28.305 16:31:25 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:28.305 16:31:25 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:07:28.305 16:31:25 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:28.305 16:31:25 thread -- common/autotest_common.sh@10 -- # set +x 00:07:28.305 ************************************ 00:07:28.305 START TEST thread_poller_perf 00:07:28.305 ************************************ 00:07:28.305 16:31:25 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:28.305 [2024-11-28 16:31:25.703874] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:28.305 [2024-11-28 16:31:25.703978] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3753176 ] 00:07:28.305 [2024-11-28 16:31:25.774089] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.306 [2024-11-28 16:31:25.811850] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.306 Running 1000 pollers for 1 seconds with 0 microseconds period. 
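The same arithmetic for the zero-period run below: busy-loop pollers are invoked on every reactor iteration, so the per-call cost drops sharply.
# poller_cost = 2501504510 / 13403000 ≈ 186 cyc, i.e. ~74 nsec per call
# vs 2925 cyc above: timed (1 us) pollers cost roughly 15x more per invocation in this run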
00:07:29.243 [2024-11-28T15:31:26.889Z] ====================================== 00:07:29.243 [2024-11-28T15:31:26.889Z] busy:2501504510 (cyc) 00:07:29.243 [2024-11-28T15:31:26.889Z] total_run_count: 13403000 00:07:29.243 [2024-11-28T15:31:26.889Z] tsc_hz: 2500000000 (cyc) 00:07:29.243 [2024-11-28T15:31:26.889Z] ====================================== 00:07:29.243 [2024-11-28T15:31:26.889Z] poller_cost: 186 (cyc), 74 (nsec) 00:07:29.243 00:07:29.243 real 0m1.178s 00:07:29.243 user 0m1.086s 00:07:29.243 sys 0m0.088s 00:07:29.243 16:31:26 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:29.243 16:31:26 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:07:29.243 ************************************ 00:07:29.243 END TEST thread_poller_perf 00:07:29.243 ************************************ 00:07:29.502 16:31:26 thread -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:29.502 16:31:26 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:29.502 16:31:26 thread -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:29.502 16:31:26 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:29.502 16:31:26 thread -- common/autotest_common.sh@10 -- # set +x 00:07:29.502 ************************************ 00:07:29.502 START TEST thread_spdk_lock 00:07:29.502 ************************************ 00:07:29.502 16:31:26 thread.thread_spdk_lock -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:29.502 [2024-11-28 16:31:26.946321] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:29.502 [2024-11-28 16:31:26.946419] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3753461 ] 00:07:29.502 [2024-11-28 16:31:27.016776] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:29.502 [2024-11-28 16:31:27.055047] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 1 00:07:29.502 [2024-11-28 16:31:27.055050] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.071 [2024-11-28 16:31:27.544004] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 967:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:30.071 [2024-11-28 16:31:27.544036] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3080:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:30.071 [2024-11-28 16:31:27.544047] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3035:sspin_stacks_print: *ERROR*: spinlock 0x130cd00 00:07:30.071 [2024-11-28 16:31:27.544912] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 862:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:30.071 [2024-11-28 16:31:27.545016] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1028:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:30.071 [2024-11-28 
16:31:27.545035] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 862:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:30.071 Starting test contend 00:07:30.071 Worker Delay Wait us Hold us Total us 00:07:30.071 0 3 164545 184286 348832 00:07:30.071 1 5 79968 285746 365714 00:07:30.071 PASS test contend 00:07:30.071 Starting test hold_by_poller 00:07:30.071 PASS test hold_by_poller 00:07:30.071 Starting test hold_by_message 00:07:30.071 PASS test hold_by_message 00:07:30.071 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:30.071 100014 assertions passed 00:07:30.071 0 assertions failed 00:07:30.071 00:07:30.071 real 0m0.666s 00:07:30.071 user 0m1.064s 00:07:30.071 sys 0m0.089s 00:07:30.071 16:31:27 thread.thread_spdk_lock -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:30.071 16:31:27 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:30.071 ************************************ 00:07:30.071 END TEST thread_spdk_lock 00:07:30.071 ************************************ 00:07:30.071 00:07:30.071 real 0m3.380s 00:07:30.071 user 0m3.394s 00:07:30.071 sys 0m0.494s 00:07:30.071 16:31:27 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:30.071 16:31:27 thread -- common/autotest_common.sh@10 -- # set +x 00:07:30.071 ************************************ 00:07:30.071 END TEST thread 00:07:30.071 ************************************ 00:07:30.071 16:31:27 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:30.071 16:31:27 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:30.071 16:31:27 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:30.071 16:31:27 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:30.071 16:31:27 -- common/autotest_common.sh@10 -- # set +x 00:07:30.071 ************************************ 00:07:30.071 START TEST app_cmdline 00:07:30.071 ************************************ 00:07:30.071 16:31:27 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:30.331 * Looking for test storage... 
00:07:30.331 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:30.331 16:31:27 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:30.331 16:31:27 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=3753785 00:07:30.331 16:31:27 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 3753785 00:07:30.331 16:31:27 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:30.331 16:31:27 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 3753785 ']' 00:07:30.331 16:31:27 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:30.331 16:31:27 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:30.331 16:31:27 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:30.331 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:30.331 16:31:27 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:30.331 16:31:27 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:30.331 [2024-11-28 16:31:27.903214] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization...
00:07:30.331 [2024-11-28 16:31:27.903301] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3753785 ] 00:07:30.331 [2024-11-28 16:31:27.970623] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.590 [2024-11-28 16:31:28.010289] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.590 16:31:28 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:30.590 16:31:28 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:07:30.590 16:31:28 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:30.850 { 00:07:30.850 "version": "SPDK v24.09.1-pre git sha1 b18e1bd62", 00:07:30.850 "fields": { 00:07:30.850 "major": 24, 00:07:30.850 "minor": 9, 00:07:30.850 "patch": 1, 00:07:30.850 "suffix": "-pre", 00:07:30.850 "commit": "b18e1bd62" 00:07:30.850 } 00:07:30.850 } 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:30.850 16:31:28 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:30.850 16:31:28 app_cmdline -- 
common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:30.850 16:31:28 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:31.110 request: 00:07:31.110 { 00:07:31.110 "method": "env_dpdk_get_mem_stats", 00:07:31.110 "req_id": 1 00:07:31.110 } 00:07:31.110 Got JSON-RPC error response 00:07:31.110 response: 00:07:31.110 { 00:07:31.110 "code": -32601, 00:07:31.110 "message": "Method not found" 00:07:31.110 } 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:31.110 16:31:28 app_cmdline -- app/cmdline.sh@1 -- # killprocess 3753785 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 3753785 ']' 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 3753785 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 3753785 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 3753785' 00:07:31.110 killing process with pid 3753785 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@969 -- # kill 3753785 00:07:31.110 16:31:28 app_cmdline -- common/autotest_common.sh@974 -- # wait 3753785 00:07:31.370 00:07:31.370 real 0m1.298s 00:07:31.370 user 0m1.448s 00:07:31.370 sys 0m0.515s 00:07:31.370 16:31:28 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:31.370 16:31:28 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:31.370 ************************************ 00:07:31.370 END TEST app_cmdline 00:07:31.370 ************************************ 00:07:31.628 16:31:29 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:31.628 16:31:29 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:31.628 16:31:29 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:31.628 16:31:29 -- common/autotest_common.sh@10 -- # set +x 00:07:31.628 ************************************ 00:07:31.628 START TEST version 00:07:31.628 ************************************ 00:07:31.628 16:31:29 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:31.628 * Looking for test storage... 
00:07:31.628 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:31.629 16:31:29 version -- app/version.sh@17 -- # get_header_version major 00:07:31.629 16:31:29 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:31.629 16:31:29 version -- app/version.sh@14 -- # cut -f2 00:07:31.629 16:31:29 version -- app/version.sh@14 -- # tr -d '"' 00:07:31.889 16:31:29 version -- app/version.sh@17 -- # major=24 00:07:31.889 16:31:29 version -- app/version.sh@18 -- # get_header_version minor 00:07:31.889 16:31:29 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:31.889 16:31:29 version -- app/version.sh@14 -- # cut -f2 00:07:31.889 16:31:29 version -- app/version.sh@14 -- # tr -d '"' 00:07:31.889 16:31:29 version -- app/version.sh@18 -- # minor=9 00:07:31.889 16:31:29 version -- app/version.sh@19 -- # get_header_version patch 00:07:31.889 16:31:29 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:31.889 16:31:29 version -- app/version.sh@14 -- # cut -f2 00:07:31.889 16:31:29 version -- app/version.sh@14 -- # tr -d '"' 00:07:31.889 16:31:29 version -- app/version.sh@19 -- # patch=1 00:07:31.889 16:31:29 version -- app/version.sh@20 -- # get_header_version suffix 00:07:31.889 16:31:29 version -- app/version.sh@14 -- # cut -f2 00:07:31.889 16:31:29 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:31.889 16:31:29 version -- app/version.sh@14 -- # tr -d '"' 00:07:31.889 16:31:29 version -- app/version.sh@20 -- # suffix=-pre 00:07:31.889 16:31:29 version -- app/version.sh@22 -- # version=24.9 00:07:31.889 16:31:29 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:31.889 16:31:29 version -- app/version.sh@25 -- # version=24.9.1 00:07:31.889 16:31:29 version -- app/version.sh@28 -- # version=24.9.1rc0 00:07:31.889 16:31:29 version -- app/version.sh@30 -- #
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:31.889 16:31:29 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:31.889 16:31:29 version -- app/version.sh@30 -- # py_version=24.9.1rc0 00:07:31.889 16:31:29 version -- app/version.sh@31 -- # [[ 24.9.1rc0 == \2\4\.\9\.\1\r\c\0 ]] 00:07:31.889 00:07:31.889 real 0m0.270s 00:07:31.889 user 0m0.153s 00:07:31.889 sys 0m0.167s 00:07:31.889 16:31:29 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:31.889 16:31:29 version -- common/autotest_common.sh@10 -- # set +x 00:07:31.889 ************************************ 00:07:31.889 END TEST version 00:07:31.889 ************************************ 00:07:31.889 16:31:29 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:31.889 16:31:29 -- spdk/autotest.sh@194 -- # uname -s 00:07:31.889 16:31:29 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:31.889 16:31:29 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:31.889 16:31:29 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:31.889 16:31:29 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@252 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@256 -- # timing_exit lib 00:07:31.889 16:31:29 -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:31.889 16:31:29 -- common/autotest_common.sh@10 -- # set +x 00:07:31.889 16:31:29 -- spdk/autotest.sh@258 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@263 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@272 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@307 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@351 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:31.889 16:31:29 -- spdk/autotest.sh@362 -- # [[ 0 -eq 1 ]] 00:07:31.889 16:31:29 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:31.889 16:31:29 -- spdk/autotest.sh@370 -- # [[ 1 -eq 1 ]] 00:07:31.889 16:31:29 -- spdk/autotest.sh@371 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:31.889 16:31:29 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:31.889 16:31:29 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:31.889 16:31:29 -- common/autotest_common.sh@10 -- # set +x 00:07:31.889 ************************************ 00:07:31.889 START TEST llvm_fuzz 00:07:31.889 ************************************ 00:07:31.889 16:31:29 llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:32.149 * Looking for test storage... 
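[Editor's note] Before the fuzz runs start, note how the version test above assembles its string: each SPDK_VERSION_* macro is grepped out of include/spdk/version.h, the second field is cut, quotes are stripped with tr, and a -pre suffix becomes an rc0 tag. A stand-alone sketch of that extraction (the header path and the tab delimiter assumed by cut -f2 are inferences from the trace, not verified against the script):

  #!/usr/bin/env bash
  # Sketch of the get_header_version pattern traced in app/version.sh.
  HDR=include/spdk/version.h   # assumed path relative to an SPDK checkout
  get_header_version() {
      # Macros look like: #define SPDK_VERSION_MAJOR<TAB>24
      grep -E "^#define SPDK_VERSION_${1}[[:space:]]+" "$HDR" |
          cut -f2 | tr -d '"'
  }
  major=$(get_header_version MAJOR)    # 24 in this build
  minor=$(get_header_version MINOR)    # 9
  patch=$(get_header_version PATCH)    # 1
  suffix=$(get_header_version SUFFIX)  # -pre
  version=$major.$minor
  (( patch != 0 )) && version=$version.$patch
  # The trace maps the -pre suffix to an rc0 tag, giving 24.9.1rc0.
  [[ $suffix == -pre ]] && version=${version}rc0
  echo "$version"

The test then asserts this string equals the output of python3 -c 'import spdk; print(spdk.__version__)', as seen in the trace.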
00:07:32.149 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:32.149 16:31:29 llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:32.149 16:31:29 llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:32.149 16:31:29 llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:32.149 16:31:29 llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:32.149 16:31:29 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:32.149 16:31:29 llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:32.149 16:31:29 llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:32.149 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.149 --rc genhtml_branch_coverage=1 00:07:32.149 --rc genhtml_function_coverage=1 00:07:32.149 --rc genhtml_legend=1 00:07:32.149 --rc geninfo_all_blocks=1 00:07:32.149 --rc geninfo_unexecuted_blocks=1 00:07:32.149 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.149 ' 00:07:32.149 16:31:29 llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:32.149 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.149 --rc genhtml_branch_coverage=1 00:07:32.150 --rc genhtml_function_coverage=1 00:07:32.150 --rc genhtml_legend=1 00:07:32.150 --rc geninfo_all_blocks=1 00:07:32.150 --rc 
geninfo_unexecuted_blocks=1 00:07:32.150 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.150 ' 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:32.150 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.150 --rc genhtml_branch_coverage=1 00:07:32.150 --rc genhtml_function_coverage=1 00:07:32.150 --rc genhtml_legend=1 00:07:32.150 --rc geninfo_all_blocks=1 00:07:32.150 --rc geninfo_unexecuted_blocks=1 00:07:32.150 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.150 ' 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:32.150 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.150 --rc genhtml_branch_coverage=1 00:07:32.150 --rc genhtml_function_coverage=1 00:07:32.150 --rc genhtml_legend=1 00:07:32.150 --rc geninfo_all_blocks=1 00:07:32.150 --rc geninfo_unexecuted_blocks=1 00:07:32.150 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.150 ' 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@548 -- # local fuzzers 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:32.150 16:31:29 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:32.150 16:31:29 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:32.150 ************************************ 00:07:32.150 START TEST nvmf_llvm_fuzz 00:07:32.150 ************************************ 00:07:32.150 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:32.413 * Looking for test storage... 
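[Editor's note] The llvm.sh dispatch traced above builds its target list by globbing test/fuzz/llvm/, stripping directory prefixes with the ${name##*/} expansion, and letting a case statement pick out the runnable targets while helpers like common.sh and llvm-gcov.sh fall through. A minimal sketch of that dispatch (rootdir and the dry-run echo are illustrative):

  #!/usr/bin/env bash
  # Sketch of the fuzzer-target enumeration from fuzz/llvm.sh.
  rootdir=${rootdir:-/path/to/spdk}        # assumed checkout location
  fuzzers=("$rootdir/test/fuzz/llvm/"*)    # every entry in the directory
  fuzzers=("${fuzzers[@]##*/}")            # basenames only: nvmf, vfio, ...
  for fuzzer in "${fuzzers[@]}"; do
      case "$fuzzer" in
          nvmf | vfio)
              # Real targets carry their own run.sh; everything else
              # (common.sh, llvm-gcov.sh) is silently skipped.
              echo "would run $rootdir/test/fuzz/llvm/$fuzzer/run.sh"
              ;;
      esac
  done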
00:07:32.413 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:32.413 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.413 --rc genhtml_branch_coverage=1 00:07:32.413 --rc genhtml_function_coverage=1 00:07:32.413 --rc genhtml_legend=1 00:07:32.413 --rc geninfo_all_blocks=1 00:07:32.413 --rc geninfo_unexecuted_blocks=1 00:07:32.413 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.413 ' 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:32.413 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.413 --rc genhtml_branch_coverage=1 00:07:32.413 --rc genhtml_function_coverage=1 00:07:32.413 --rc genhtml_legend=1 00:07:32.413 --rc geninfo_all_blocks=1 00:07:32.413 --rc geninfo_unexecuted_blocks=1 00:07:32.413 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.413 ' 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:32.413 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.413 --rc genhtml_branch_coverage=1 00:07:32.413 --rc genhtml_function_coverage=1 00:07:32.413 --rc genhtml_legend=1 00:07:32.413 --rc geninfo_all_blocks=1 00:07:32.413 --rc geninfo_unexecuted_blocks=1 00:07:32.413 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.413 ' 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:32.413 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.413 --rc genhtml_branch_coverage=1 00:07:32.413 --rc genhtml_function_coverage=1 00:07:32.413 --rc genhtml_legend=1 00:07:32.413 --rc geninfo_all_blocks=1 00:07:32.413 --rc geninfo_unexecuted_blocks=1 00:07:32.413 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.413 ' 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- 
# CONFIG_OCF_PATH= 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_AIO_FSDEV=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_UBLK=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_ISAL_CRYPTO=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_OPENSSL_PATH= 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OCF=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_FUSE=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_VTUNE_DIR= 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FSDEV=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_CRYPTO=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_PGO_USE=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_VHOST=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_DAOS=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DAOS_DIR= 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_UNIT_TESTS=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_VIRTIO=y 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_DPDK_UADK=n 00:07:32.413 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_COVERAGE=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_RDMA=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_LZ4=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_URING_PATH= 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_XNVME=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_VFIO_USER=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_ARCH=native 00:07:32.414 16:31:29 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_HAVE_EVP_MAC=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_URING_ZNS=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_WERROR=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_HAVE_LIBBSD=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_UBSAN=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_IPSEC_MB_DIR= 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_GOLANG=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_ISAL=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_IDXD_KERNEL=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_RDMA_PROV=verbs 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_APPS=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_SHARED=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_HAVE_KEYUTILS=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_FC_PATH= 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_FC=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_AVAHI=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_FIO_PLUGIN=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_RAID5F=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_EXAMPLES=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_TESTS=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_CRYPTO_MLX5=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_MAX_LCORES=128 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_IPSEC_MB=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_PGO_DIR= 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_DEBUG=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_CROSS_PREFIX= 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_COPY_FILE_RANGE=y 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_URING=n 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:32.414 #define SPDK_CONFIG_H 00:07:32.414 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:32.414 #define SPDK_CONFIG_APPS 1 00:07:32.414 #define SPDK_CONFIG_ARCH native 00:07:32.414 #undef SPDK_CONFIG_ASAN 00:07:32.414 #undef SPDK_CONFIG_AVAHI 00:07:32.414 #undef SPDK_CONFIG_CET 00:07:32.414 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:32.414 #define SPDK_CONFIG_COVERAGE 1 00:07:32.414 #define SPDK_CONFIG_CROSS_PREFIX 00:07:32.414 #undef SPDK_CONFIG_CRYPTO 00:07:32.414 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:32.414 #undef SPDK_CONFIG_CUSTOMOCF 00:07:32.414 #undef SPDK_CONFIG_DAOS 00:07:32.414 #define SPDK_CONFIG_DAOS_DIR 00:07:32.414 #define SPDK_CONFIG_DEBUG 1 00:07:32.414 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:32.414 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:32.414 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:32.414 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:32.414 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:32.414 #undef SPDK_CONFIG_DPDK_UADK 00:07:32.414 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:32.414 #define SPDK_CONFIG_EXAMPLES 1 00:07:32.414 #undef SPDK_CONFIG_FC 00:07:32.414 #define SPDK_CONFIG_FC_PATH 00:07:32.414 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:32.414 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:32.414 #define SPDK_CONFIG_FSDEV 1 00:07:32.414 #undef SPDK_CONFIG_FUSE 00:07:32.414 #define SPDK_CONFIG_FUZZER 1 00:07:32.414 
#define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:32.414 #undef SPDK_CONFIG_GOLANG 00:07:32.414 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:32.414 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:32.414 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:32.414 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:32.414 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:32.414 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:32.414 #undef SPDK_CONFIG_HAVE_LZ4 00:07:32.414 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:32.414 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:32.414 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:32.414 #define SPDK_CONFIG_IDXD 1 00:07:32.414 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:32.414 #undef SPDK_CONFIG_IPSEC_MB 00:07:32.414 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:32.414 #define SPDK_CONFIG_ISAL 1 00:07:32.414 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:32.414 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:32.414 #define SPDK_CONFIG_LIBDIR 00:07:32.414 #undef SPDK_CONFIG_LTO 00:07:32.414 #define SPDK_CONFIG_MAX_LCORES 128 00:07:32.414 #define SPDK_CONFIG_NVME_CUSE 1 00:07:32.414 #undef SPDK_CONFIG_OCF 00:07:32.414 #define SPDK_CONFIG_OCF_PATH 00:07:32.414 #define SPDK_CONFIG_OPENSSL_PATH 00:07:32.414 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:32.414 #define SPDK_CONFIG_PGO_DIR 00:07:32.414 #undef SPDK_CONFIG_PGO_USE 00:07:32.414 #define SPDK_CONFIG_PREFIX /usr/local 00:07:32.414 #undef SPDK_CONFIG_RAID5F 00:07:32.414 #undef SPDK_CONFIG_RBD 00:07:32.414 #define SPDK_CONFIG_RDMA 1 00:07:32.414 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:32.414 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:32.414 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:32.414 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:32.414 #undef SPDK_CONFIG_SHARED 00:07:32.414 #undef SPDK_CONFIG_SMA 00:07:32.414 #define SPDK_CONFIG_TESTS 1 00:07:32.414 #undef SPDK_CONFIG_TSAN 00:07:32.414 #define SPDK_CONFIG_UBLK 1 00:07:32.414 #define SPDK_CONFIG_UBSAN 1 00:07:32.414 #undef SPDK_CONFIG_UNIT_TESTS 00:07:32.414 #undef SPDK_CONFIG_URING 00:07:32.414 #define SPDK_CONFIG_URING_PATH 00:07:32.414 #undef SPDK_CONFIG_URING_ZNS 00:07:32.414 #undef SPDK_CONFIG_USDT 00:07:32.414 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:32.414 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:32.414 #define SPDK_CONFIG_VFIO_USER 1 00:07:32.414 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:32.414 #define SPDK_CONFIG_VHOST 1 00:07:32.414 #define SPDK_CONFIG_VIRTIO 1 00:07:32.414 #undef SPDK_CONFIG_VTUNE 00:07:32.414 #define SPDK_CONFIG_VTUNE_DIR 00:07:32.414 #define SPDK_CONFIG_WERROR 1 00:07:32.414 #define SPDK_CONFIG_WPDK_DIR 00:07:32.414 #undef SPDK_CONFIG_XNVME 00:07:32.414 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:32.414 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- 
paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:32.415 16:31:29 
llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:32.415 16:31:29 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 
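[Editor's note] The long run of flag exports in this stretch of the trace is consistent with the usual default-then-export idiom: a bare ':' command whose only effect is the ${VAR:=default} expansion, so a value injected by the CI job survives, followed by an export on the next line. A minimal sketch of the idiom (the flag names are just examples; this is an inference from the trace, not the autotest source):

  #!/usr/bin/env bash
  # Sketch of the default-then-export idiom behind the trace above.
  # ':' is a no-op, so each line exists only for its side effect:
  # ${VAR:=0} assigns 0 when VAR is unset or empty.
  : "${SPDK_TEST_FUZZER:=0}"
  export SPDK_TEST_FUZZER
  : "${SPDK_TEST_FUZZER_SHORT:=0}"
  export SPDK_TEST_FUZZER_SHORT
  # A caller-supplied value wins over the default:
  #   SPDK_TEST_FUZZER=1 bash this_script.sh   # flag stays 1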
00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:32.415 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:32.416 16:31:29 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:32.416 16:31:29 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@189 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # cat 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@257 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:32.416 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV= 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]] 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind= 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind= 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE= 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 3754230 ]] 00:07:32.417 
16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 3754230 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.0XgBe0 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.0XgBe0/tests/nvmf /tmp/spdk.0XgBe0 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=4096 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5284425728 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=51644088320 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730607104 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=10086518784 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30860537856 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865301504 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4763648 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=12340129792 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5992448 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30863499264 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=1806336 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173044736 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173057024 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:07:32.417 * Looking for test storage... 
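The trace above is set_test_storage probing every mounted filesystem before deciding where /tmp/spdk.0XgBe0/tests/nvmf can live. A condensed sketch of that logic, reconstructed from the traced statements (the 1K-block-to-byte conversion and the single-candidate flow are simplifying assumptions, not verbatim autotest_common.sh):

set_test_storage() {
    local requested_size=$1
    local target_dir=$2
    local -A mounts fss sizes avails uses
    local source fs size use avail _ mount

    # Snapshot every mount; df -T columns are
    # Filesystem Type 1K-blocks Used Available Use% Mounted-on,
    # matching the traced "read -r source fs size use avail _ mount".
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$((size * 1024))    # assumed byte conversion
        uses["$mount"]=$((use * 1024))
        avails["$mount"]=$((avail * 1024))
    done < <(df -T | grep -v Filesystem)

    # Resolve which mount backs the candidate directory, as in the
    # "df ... | awk '$1 !~ /Filesystem/{print $6}'" step above.
    mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
    local target_space=${avails[$mount]}
    ((target_space >= requested_size)) || return 1

    # Reject the mount if the test would push it past 95% full:
    # new_size = bytes already used + bytes requested (cf. @392-@393).
    local new_size=$(( ${uses[$mount]} + requested_size ))
    (( new_size * 100 / ${sizes[$mount]} > 95 )) && return 1

    export SPDK_TEST_STORAGE=$target_dir
}

In the run above the overlay root wins with ~51.6 GB available against the ~2.2 GB request, which is why the log reports "Found test storage at .../test/fuzz/llvm/nvmf".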
00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:32.417 16:31:29 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/ 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=51644088320 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=12301111296 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.418 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1668 -- # set -o errtrace 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1672 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1673 -- # true 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1675 -- # xtrace_fd 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:07:32.418 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:07:32.677 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.677 --rc genhtml_branch_coverage=1 00:07:32.677 --rc genhtml_function_coverage=1 00:07:32.677 --rc genhtml_legend=1 00:07:32.677 --rc geninfo_all_blocks=1 00:07:32.677 --rc geninfo_unexecuted_blocks=1 00:07:32.677 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.677 ' 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:07:32.677 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.677 --rc genhtml_branch_coverage=1 00:07:32.677 --rc genhtml_function_coverage=1 00:07:32.677 --rc genhtml_legend=1 00:07:32.677 --rc geninfo_all_blocks=1 00:07:32.677 --rc geninfo_unexecuted_blocks=1 00:07:32.677 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.677 ' 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:07:32.677 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.677 --rc genhtml_branch_coverage=1 00:07:32.677 --rc genhtml_function_coverage=1 00:07:32.677 --rc genhtml_legend=1 00:07:32.677 --rc geninfo_all_blocks=1 00:07:32.677 --rc geninfo_unexecuted_blocks=1 00:07:32.677 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.677 ' 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:07:32.677 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:32.677 --rc genhtml_branch_coverage=1 00:07:32.677 --rc genhtml_function_coverage=1 00:07:32.677 --rc genhtml_legend=1 00:07:32.677 --rc geninfo_all_blocks=1 00:07:32.677 --rc geninfo_unexecuted_blocks=1 00:07:32.677 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:32.677 ' 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:32.677 16:31:30 
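The lt/cmp_versions calls traced just above gate the LCOV_OPTS setup on lcov being older than 2.0. A minimal re-creation of that comparison, reconstructed from the xtrace (the zero fill-in for short versions and the equality pattern are assumptions; scripts/common.sh may differ in detail, and numeric components are assumed throughout):

lt() { cmp_versions "$1" '<' "$2"; }

cmp_versions() {
    local IFS=.-:               # split on dots, dashes, colons, as in @336-@337
    local -a ver1 ver2
    local op=$2
    read -ra ver1 <<< "$1"
    read -ra ver2 <<< "$3"
    local v c1 c2
    # Walk the longer component list; missing components compare as 0.
    for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
        c1=${ver1[v]:-0} c2=${ver2[v]:-0}
        if ((c1 > c2)); then [[ $op == '>' ]]; return; fi
        if ((c1 < c2)); then [[ $op == '<' ]]; return; fi
    done
    [[ $op == *'='* ]]          # equal versions only satisfy <=, >=, ==
}

Here "lt 1.15 2" succeeds at the first component (1 < 2), matching the traced "return 0" at @368, so the run builds the pre-2.0 lcov option set with the llvm-gcov.sh gcov tool.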
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:32.677 16:31:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:32.677 [2024-11-28 16:31:30.151865] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:32.677 [2024-11-28 16:31:30.151963] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3754290 ] 00:07:32.936 [2024-11-28 16:31:30.404999] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.936 [2024-11-28 16:31:30.436475] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.936 [2024-11-28 16:31:30.489183] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:32.936 [2024-11-28 16:31:30.505541] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:32.936 INFO: Running with entropic power schedule (0xFF, 100). 00:07:32.936 INFO: Seed: 2058772596 00:07:32.936 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:32.936 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:32.936 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:32.936 INFO: A corpus is not provided, starting from an empty corpus 00:07:32.936 #2 INITED exec/s: 0 rss: 65Mb 00:07:32.936 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:32.936 This may also happen if the target rejected all inputs we tried so far 00:07:32.936 [2024-11-28 16:31:30.570921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:32.936 [2024-11-28 16:31:30.570951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.455 NEW_FUNC[1/714]: 0x452788 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:33.455 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:33.455 #8 NEW cov: 12145 ft: 12147 corp: 2/69b lim: 320 exec/s: 0 rss: 72Mb L: 68/68 MS: 1 InsertRepeatedBytes- 00:07:33.455 [2024-11-28 16:31:30.891852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:33.455 [2024-11-28 16:31:30.891898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.455 #12 NEW cov: 12284 ft: 12739 corp: 3/186b lim: 320 exec/s: 0 rss: 72Mb L: 117/117 MS: 4 ChangeBit-InsertRepeatedBytes-CMP-InsertRepeatedBytes- DE: "\377\377\377\""- 00:07:33.455 [2024-11-28 16:31:30.931768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:33.455 [2024-11-28 16:31:30.931794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.455 #13 NEW cov: 12290 ft: 12923 corp: 4/303b lim: 320 exec/s: 0 rss: 72Mb L: 117/117 MS: 1 PersAutoDict- DE: "\377\377\377\""- 00:07:33.455 [2024-11-28 16:31:30.991955] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.455 [2024-11-28 16:31:30.991981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.455 #14 NEW cov: 12375 ft: 13170 corp: 5/375b lim: 320 exec/s: 0 rss: 72Mb L: 72/117 MS: 1 PersAutoDict- DE: "\377\377\377\""- 00:07:33.455 [2024-11-28 16:31:31.052067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (27) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.455 [2024-11-28 16:31:31.052093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.455 NEW_FUNC[1/1]: 0x150df38 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2213 00:07:33.455 #19 NEW cov: 12406 ft: 13416 corp: 6/444b lim: 320 exec/s: 0 rss: 72Mb L: 69/117 MS: 5 ShuffleBytes-CrossOver-ChangeBit-ChangeByte-CrossOver- 00:07:33.455 [2024-11-28 16:31:31.092188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (27) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.455 [2024-11-28 16:31:31.092213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.715 #20 NEW cov: 12406 ft: 13493 corp: 7/517b lim: 320 exec/s: 0 rss: 73Mb L: 73/117 MS: 1 CMP- DE: "(\000\000\000"- 00:07:33.715 [2024-11-28 16:31:31.152377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:33.715 [2024-11-28 16:31:31.152403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.715 #21 NEW cov: 12406 ft: 13554 corp: 8/635b lim: 320 exec/s: 0 rss: 73Mb L: 118/118 MS: 1 InsertByte- 00:07:33.715 [2024-11-28 16:31:31.192459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (24) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:33.715 [2024-11-28 16:31:31.192484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.715 #25 NEW cov: 12407 ft: 13642 corp: 9/754b lim: 320 exec/s: 0 rss: 73Mb L: 119/119 MS: 4 ChangeByte-CopyPart-CopyPart-CrossOver- 00:07:33.715 [2024-11-28 16:31:31.232616] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.715 [2024-11-28 16:31:31.232642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.715 #26 NEW cov: 12407 ft: 13665 corp: 10/830b lim: 320 exec/s: 0 rss: 73Mb L: 76/119 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:33.715 [2024-11-28 16:31:31.292775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:33.715 [2024-11-28 16:31:31.292800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.715 #32 NEW cov: 12407 ft: 13703 corp: 11/948b lim: 320 exec/s: 0 rss: 73Mb L: 118/119 MS: 1 InsertByte- 00:07:33.715 [2024-11-28 16:31:31.332898] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.715 [2024-11-28 16:31:31.332923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.715 #33 NEW cov: 12407 ft: 13766 corp: 12/1042b lim: 320 exec/s: 0 rss: 73Mb L: 94/119 MS: 1 CopyPart- 00:07:33.974 [2024-11-28 16:31:31.372975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (27) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.974 [2024-11-28 16:31:31.373001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.974 #39 NEW cov: 12407 ft: 13843 corp: 13/1111b lim: 320 exec/s: 0 rss: 73Mb L: 69/119 MS: 1 ChangeByte- 00:07:33.974 [2024-11-28 16:31:31.413121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffffffff cdw10:ffff2bff cdw11:ffffffff 00:07:33.974 [2024-11-28 16:31:31.413147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.974 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:33.974 #40 NEW cov: 12430 ft: 13882 corp: 14/1228b lim: 320 exec/s: 0 rss: 73Mb L: 117/119 MS: 1 ChangeByte- 00:07:33.974 [2024-11-28 16:31:31.473391] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.974 [2024-11-28 16:31:31.473418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.974 [2024-11-28 16:31:31.473474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.974 [2024-11-28 16:31:31.473489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.974 #46 NEW cov: 12430 ft: 14081 corp: 15/1364b lim: 320 exec/s: 0 rss: 73Mb L: 136/136 MS: 1 CopyPart- 00:07:33.974 [2024-11-28 16:31:31.513436] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.974 [2024-11-28 16:31:31.513463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.974 #47 NEW cov: 12430 ft: 14108 corp: 16/1432b lim: 320 exec/s: 0 rss: 73Mb L: 68/136 MS: 1 ShuffleBytes- 00:07:33.974 [2024-11-28 16:31:31.553518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:33.974 [2024-11-28 16:31:31.553544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:33.974 #48 NEW cov: 12430 ft: 14130 corp: 17/1553b lim: 320 exec/s: 48 rss: 73Mb L: 121/136 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:33.974 [2024-11-28 16:31:31.593672] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:33.974 
[2024-11-28 16:31:31.593697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.234 #49 NEW cov: 12430 ft: 14139 corp: 18/1623b lim: 320 exec/s: 49 rss: 73Mb L: 70/136 MS: 1 CMP- DE: "\377\377"- 00:07:34.234 [2024-11-28 16:31:31.653797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.234 [2024-11-28 16:31:31.653822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.234 #50 NEW cov: 12430 ft: 14188 corp: 19/1717b lim: 320 exec/s: 50 rss: 73Mb L: 94/136 MS: 1 ChangeBinInt- 00:07:34.234 [2024-11-28 16:31:31.713971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.234 [2024-11-28 16:31:31.713997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.234 #51 NEW cov: 12430 ft: 14227 corp: 20/1789b lim: 320 exec/s: 51 rss: 73Mb L: 72/136 MS: 1 ChangeBinInt- 00:07:34.234 [2024-11-28 16:31:31.754071] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.234 [2024-11-28 16:31:31.754097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.234 #52 NEW cov: 12430 ft: 14279 corp: 21/1891b lim: 320 exec/s: 52 rss: 73Mb L: 102/136 MS: 1 CMP- DE: "\377\222E'A\266f."- 00:07:34.234 [2024-11-28 16:31:31.814261] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.234 [2024-11-28 16:31:31.814286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.234 #53 NEW cov: 12430 ft: 14319 corp: 22/1963b lim: 320 exec/s: 53 rss: 73Mb L: 72/136 MS: 1 ChangeBinInt- 00:07:34.234 [2024-11-28 16:31:31.854337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (24) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:34.234 [2024-11-28 16:31:31.854362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.493 #54 NEW cov: 12430 ft: 14325 corp: 23/2082b lim: 320 exec/s: 54 rss: 73Mb L: 119/136 MS: 1 ChangeBinInt- 00:07:34.494 [2024-11-28 16:31:31.914523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.494 [2024-11-28 16:31:31.914548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.494 #55 NEW cov: 12430 ft: 14343 corp: 24/2180b lim: 320 exec/s: 55 rss: 73Mb L: 98/136 MS: 1 PersAutoDict- DE: "(\000\000\000"- 00:07:34.494 [2024-11-28 16:31:31.954689] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.494 [2024-11-28 16:31:31.954716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.494 [2024-11-28 
16:31:31.994763] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.494 [2024-11-28 16:31:31.994788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.494 #57 NEW cov: 12430 ft: 14346 corp: 25/2278b lim: 320 exec/s: 57 rss: 73Mb L: 98/136 MS: 2 PersAutoDict-CopyPart- DE: "\000\000\000\000"- 00:07:34.494 [2024-11-28 16:31:32.034894] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.494 [2024-11-28 16:31:32.034921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.494 #58 NEW cov: 12430 ft: 14357 corp: 26/2372b lim: 320 exec/s: 58 rss: 73Mb L: 94/136 MS: 1 ChangeByte- 00:07:34.494 [2024-11-28 16:31:32.074949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (27) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.494 [2024-11-28 16:31:32.074979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.494 #59 NEW cov: 12430 ft: 14368 corp: 27/2445b lim: 320 exec/s: 59 rss: 73Mb L: 73/136 MS: 1 CopyPart- 00:07:34.494 [2024-11-28 16:31:32.135148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (24) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:0022ffff 00:07:34.494 [2024-11-28 16:31:32.135174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.753 #60 NEW cov: 12430 ft: 14372 corp: 28/2564b lim: 320 exec/s: 60 rss: 73Mb L: 119/136 MS: 1 CopyPart- 00:07:34.753 [2024-11-28 16:31:32.195321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:34.753 [2024-11-28 16:31:32.195346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.753 #61 NEW cov: 12430 ft: 14390 corp: 29/2682b lim: 320 exec/s: 61 rss: 74Mb L: 118/136 MS: 1 ChangeBinInt- 00:07:34.753 [2024-11-28 16:31:32.255495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (27) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.753 [2024-11-28 16:31:32.255521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.753 #67 NEW cov: 12430 ft: 14398 corp: 30/2804b lim: 320 exec/s: 67 rss: 74Mb L: 122/136 MS: 1 CopyPart- 00:07:34.753 [2024-11-28 16:31:32.295779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:34.753 [2024-11-28 16:31:32.295805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.753 [2024-11-28 16:31:32.295866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffff22ff 00:07:34.753 [2024-11-28 16:31:32.295880] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.753 #68 NEW cov: 12430 ft: 14434 corp: 31/2977b lim: 320 exec/s: 68 rss: 74Mb L: 173/173 MS: 1 CopyPart- 00:07:34.753 [2024-11-28 16:31:32.355816] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.753 [2024-11-28 16:31:32.355841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:34.753 #69 NEW cov: 12430 ft: 14442 corp: 32/3071b lim: 320 exec/s: 69 rss: 74Mb L: 94/173 MS: 1 CopyPart- 00:07:34.753 [2024-11-28 16:31:32.395909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:34.753 [2024-11-28 16:31:32.395936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.013 #70 NEW cov: 12430 ft: 14469 corp: 33/3143b lim: 320 exec/s: 70 rss: 74Mb L: 72/173 MS: 1 CopyPart- 00:07:35.013 [2024-11-28 16:31:32.456086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (24) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff 00:07:35.013 [2024-11-28 16:31:32.456112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.013 #71 NEW cov: 12430 ft: 14479 corp: 34/3262b lim: 320 exec/s: 71 rss: 74Mb L: 119/173 MS: 1 ChangeBit- 00:07:35.013 [2024-11-28 16:31:32.496191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.013 [2024-11-28 16:31:32.496217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.013 #72 NEW cov: 12430 ft: 14533 corp: 35/3328b lim: 320 exec/s: 72 rss: 74Mb L: 66/173 MS: 1 EraseBytes- 00:07:35.013 [2024-11-28 16:31:32.556477] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.013 [2024-11-28 16:31:32.556505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.013 [2024-11-28 16:31:32.556568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:35.013 [2024-11-28 16:31:32.556582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.013 #73 NEW cov: 12430 ft: 14548 corp: 36/3457b lim: 320 exec/s: 36 rss: 74Mb L: 129/173 MS: 1 CrossOver- 00:07:35.013 #73 DONE cov: 12430 ft: 14548 corp: 36/3457b lim: 320 exec/s: 36 rss: 74Mb 00:07:35.013 ###### Recommended dictionary. ###### 00:07:35.013 "\377\377\377\"" # Uses: 2 00:07:35.013 "(\000\000\000" # Uses: 1 00:07:35.013 "\000\000\000\000" # Uses: 2 00:07:35.013 "\377\377" # Uses: 0 00:07:35.013 "\377\222E'A\266f." # Uses: 1 00:07:35.013 ###### End of recommended dictionary. 
###### 00:07:35.013 Done 73 runs in 2 second(s) 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:35.273 16:31:32 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:35.273 [2024-11-28 16:31:32.736619] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:35.273 [2024-11-28 16:31:32.736689] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3754830 ] 00:07:35.532 [2024-11-28 16:31:32.978999] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.532 [2024-11-28 16:31:33.009722] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.532 [2024-11-28 16:31:33.061782] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.532 [2024-11-28 16:31:33.078116] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:35.532 INFO: Running with entropic power schedule (0xFF, 100). 
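Both launches follow the same per-run parameterization visible in the trace: fuzzer N listens on TCP port 44NN and gets its own JSON config, suppression entries, and corpus directory. A sketch of that pattern; $SPDK_DIR and $CORPUS_BASE are illustrative shorthand rather than variables from run.sh, and the redirections are inferred (xtrace does not show them) from the file names in LSAN_OPTIONS and the -c argument:

start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    local suppress_file=/var/tmp/suppress_nvmf_fuzz
    local port

    port=44$(printf %02d "$fuzzer_type")      # 0 -> 4400, 1 -> 4401, ...
    mkdir -p "$CORPUS_BASE/llvm_nvmf_${fuzzer_type}"

    # Rewrite the template's default listen port for this run's target.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # Suppress the two known-benign leaks so LSAN doesn't fail the run.
    {
        echo leak:spdk_nvmf_qpair_disconnect
        echo leak:nvmf_ctrlr_create
    } > "$suppress_file"

    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
        -c "$nvmf_cfg" -t "$timen" -D "$CORPUS_BASE/llvm_nvmf_${fuzzer_type}" \
        -Z "$fuzzer_type"
}

The -Z value selects which admin-command fuzzer runs: run 0 above exercised fuzz_admin_command, while run 1 below targets fuzz_admin_get_log_page_command on port 4401.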
00:07:35.532 INFO: Seed: 336821214 00:07:35.532 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:35.532 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:35.532 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:35.532 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.532 #2 INITED exec/s: 0 rss: 64Mb 00:07:35.532 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:35.532 This may also happen if the target rejected all inputs we tried so far 00:07:35.532 [2024-11-28 16:31:33.122794] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273408) > buf size (4096) 00:07:35.532 [2024-11-28 16:31:33.122923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:35.532 [2024-11-28 16:31:33.122949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.051 NEW_FUNC[1/715]: 0x453088 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:36.051 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:36.051 #4 NEW cov: 12246 ft: 12245 corp: 2/10b lim: 30 exec/s: 0 rss: 72Mb L: 9/9 MS: 2 ShuffleBytes-CMP- DE: "\377\222E(\224P\014v"- 00:07:36.051 [2024-11-28 16:31:33.484908] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273408) > buf size (4096) 00:07:36.051 [2024-11-28 16:31:33.485091] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (81972) > buf size (4096) 00:07:36.051 [2024-11-28 16:31:33.485480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.051 [2024-11-28 16:31:33.485534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.051 [2024-11-28 16:31:33.485672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:500c0076 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.051 [2024-11-28 16:31:33.485696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.051 #5 NEW cov: 12361 ft: 13321 corp: 3/22b lim: 30 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:07:36.051 [2024-11-28 16:31:33.554873] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100002891 00:07:36.051 [2024-11-28 16:31:33.555234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.051 [2024-11-28 16:31:33.555268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.051 #6 NEW cov: 12373 ft: 13567 corp: 4/31b lim: 30 exec/s: 0 rss: 72Mb L: 9/12 MS: 1 ChangeBinInt- 00:07:36.051 [2024-11-28 16:31:33.605043] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273408) > buf size (4096) 00:07:36.051 [2024-11-28 16:31:33.605425] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.051 [2024-11-28 16:31:33.605458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.051 #12 NEW cov: 12458 ft: 13757 corp: 5/41b lim: 30 exec/s: 0 rss: 72Mb L: 10/12 MS: 1 InsertByte- 00:07:36.051 [2024-11-28 16:31:33.655338] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273408) > buf size (4096) 00:07:36.051 [2024-11-28 16:31:33.655508] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (81972) > buf size (4096) 00:07:36.051 [2024-11-28 16:31:33.655837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.051 [2024-11-28 16:31:33.655869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.051 [2024-11-28 16:31:33.655981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:500c007a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.051 [2024-11-28 16:31:33.656001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.051 #13 NEW cov: 12458 ft: 13904 corp: 6/53b lim: 30 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 ChangeBinInt- 00:07:36.311 [2024-11-28 16:31:33.725441] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (571484) > buf size (4096) 00:07:36.311 [2024-11-28 16:31:33.725822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2e160237 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.311 [2024-11-28 16:31:33.725854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.311 #14 NEW cov: 12458 ft: 13983 corp: 7/62b lim: 30 exec/s: 0 rss: 72Mb L: 9/12 MS: 1 CMP- DE: ".\0267\002\000\000\000\000"- 00:07:36.311 [2024-11-28 16:31:33.795704] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (571484) > buf size (4096) 00:07:36.311 [2024-11-28 16:31:33.796070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2e160237 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.311 [2024-11-28 16:31:33.796101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.311 #15 NEW cov: 12458 ft: 14052 corp: 8/71b lim: 30 exec/s: 0 rss: 72Mb L: 9/12 MS: 1 ChangeByte- 00:07:36.311 [2024-11-28 16:31:33.866004] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:36.311 [2024-11-28 16:31:33.866170] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (785452) > buf size (4096) 00:07:36.311 [2024-11-28 16:31:33.866319] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (676164) > buf size (4096) 00:07:36.311 [2024-11-28 16:31:33.866677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.311 [2024-11-28 16:31:33.866708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:36.311 [2024-11-28 16:31:33.866823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff0a02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.311 [2024-11-28 16:31:33.866842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.311 [2024-11-28 16:31:33.866959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:9450020c cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.311 [2024-11-28 16:31:33.866980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.311 #16 NEW cov: 12458 ft: 14360 corp: 9/90b lim: 30 exec/s: 0 rss: 72Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:36.311 [2024-11-28 16:31:33.916080] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11264) > buf size (4096) 00:07:36.311 [2024-11-28 16:31:33.916430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff0092 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.311 [2024-11-28 16:31:33.916458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.311 #17 NEW cov: 12458 ft: 14390 corp: 10/98b lim: 30 exec/s: 0 rss: 72Mb L: 8/19 MS: 1 EraseBytes- 00:07:36.570 [2024-11-28 16:31:33.966176] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (571508) > buf size (4096) 00:07:36.570 [2024-11-28 16:31:33.966548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2e1c0237 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.570 [2024-11-28 16:31:33.966579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.570 #18 NEW cov: 12458 ft: 14459 corp: 11/107b lim: 30 exec/s: 0 rss: 72Mb L: 9/19 MS: 1 ChangeBinInt- 00:07:36.570 [2024-11-28 16:31:34.016402] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (56324) > buf size (4096) 00:07:36.571 [2024-11-28 16:31:34.016754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:37000002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.571 [2024-11-28 16:31:34.016785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.571 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:36.571 #19 NEW cov: 12475 ft: 14521 corp: 12/116b lim: 30 exec/s: 0 rss: 72Mb L: 9/19 MS: 1 ShuffleBytes- 00:07:36.571 [2024-11-28 16:31:34.086518] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (571484) > buf size (4096) 00:07:36.571 [2024-11-28 16:31:34.086889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2e160237 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.571 [2024-11-28 16:31:34.086919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.571 #20 NEW cov: 12475 ft: 14529 corp: 13/125b lim: 30 exec/s: 20 rss: 73Mb L: 9/19 MS: 1 PersAutoDict- DE: ".\0267\002\000\000\000\000"- 00:07:36.571 [2024-11-28 16:31:34.156716] 
ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2d16 00:07:36.571 [2024-11-28 16:31:34.157048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:37000002 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.571 [2024-11-28 16:31:34.157077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.571 #21 NEW cov: 12475 ft: 14574 corp: 14/134b lim: 30 exec/s: 21 rss: 73Mb L: 9/19 MS: 1 ShuffleBytes- 00:07:36.830 [2024-11-28 16:31:34.227203] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:36.830 [2024-11-28 16:31:34.227541] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:36.830 [2024-11-28 16:31:34.227707] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (149556) > buf size (4096) 00:07:36.830 [2024-11-28 16:31:34.228047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.228077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.830 [2024-11-28 16:31:34.228197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.228214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.830 [2024-11-28 16:31:34.228332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.228351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.830 [2024-11-28 16:31:34.228470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:920c0076 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.228491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.830 #22 NEW cov: 12492 ft: 15104 corp: 15/158b lim: 30 exec/s: 22 rss: 73Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:07:36.830 [2024-11-28 16:31:34.297154] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273408) > buf size (4096) 00:07:36.830 [2024-11-28 16:31:34.297509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.297539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.830 #23 NEW cov: 12492 ft: 15121 corp: 16/166b lim: 30 exec/s: 23 rss: 73Mb L: 8/24 MS: 1 EraseBytes- 00:07:36.830 [2024-11-28 16:31:34.367326] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (571484) > buf size (4096) 00:07:36.830 [2024-11-28 16:31:34.367648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2e160237 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.367676] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.830 #24 NEW cov: 12492 ft: 15161 corp: 17/173b lim: 30 exec/s: 24 rss: 73Mb L: 7/24 MS: 1 EraseBytes- 00:07:36.830 [2024-11-28 16:31:34.437785] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273408) > buf size (4096) 00:07:36.830 [2024-11-28 16:31:34.437959] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (81972) > buf size (4096) 00:07:36.830 [2024-11-28 16:31:34.438590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.438620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.830 [2024-11-28 16:31:34.438738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:500c0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.438756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.830 [2024-11-28 16:31:34.438870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.438887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.830 [2024-11-28 16:31:34.439006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:36.830 [2024-11-28 16:31:34.439025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.830 #30 NEW cov: 12492 ft: 15228 corp: 18/200b lim: 30 exec/s: 30 rss: 73Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:37.090 [2024-11-28 16:31:34.487630] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3716 00:07:37.090 [2024-11-28 16:31:34.487982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.090 [2024-11-28 16:31:34.488011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.090 #31 NEW cov: 12492 ft: 15247 corp: 19/209b lim: 30 exec/s: 31 rss: 73Mb L: 9/27 MS: 1 ShuffleBytes- 00:07:37.090 [2024-11-28 16:31:34.537816] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3716 00:07:37.090 [2024-11-28 16:31:34.538159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2d000080 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.090 [2024-11-28 16:31:34.538188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.090 #32 NEW cov: 12492 ft: 15279 corp: 20/218b lim: 30 exec/s: 32 rss: 73Mb L: 9/27 MS: 1 ChangeBit- 00:07:37.090 [2024-11-28 16:31:34.608281] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:37.090 [2024-11-28 16:31:34.608612] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0xff 00:07:37.090 [2024-11-28 16:31:34.608778] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (149556) > buf size (4096) 00:07:37.090 [2024-11-28 16:31:34.609121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.090 [2024-11-28 16:31:34.609152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.090 [2024-11-28 16:31:34.609278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.090 [2024-11-28 16:31:34.609298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.090 [2024-11-28 16:31:34.609418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.090 [2024-11-28 16:31:34.609435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.090 [2024-11-28 16:31:34.609556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:920c0076 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.090 [2024-11-28 16:31:34.609575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.090 #33 NEW cov: 12492 ft: 15349 corp: 21/242b lim: 30 exec/s: 33 rss: 73Mb L: 24/27 MS: 1 ChangeBinInt- 00:07:37.090 [2024-11-28 16:31:34.678236] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273408) > buf size (4096) 00:07:37.090 [2024-11-28 16:31:34.678619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.090 [2024-11-28 16:31:34.678649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.090 #34 NEW cov: 12492 ft: 15384 corp: 22/252b lim: 30 exec/s: 34 rss: 73Mb L: 10/27 MS: 1 ChangeBit- 00:07:37.350 [2024-11-28 16:31:34.738643] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.350 [2024-11-28 16:31:34.738813] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009245 00:07:37.350 [2024-11-28 16:31:34.738971] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (41556) > buf size (4096) 00:07:37.350 [2024-11-28 16:31:34.739317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.739346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.350 [2024-11-28 16:31:34.739461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0300830a cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.739481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.350 [2024-11-28 16:31:34.739603] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:28940050 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.739621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.350 #35 NEW cov: 12492 ft: 15398 corp: 23/272b lim: 30 exec/s: 35 rss: 73Mb L: 20/27 MS: 1 CMP- DE: "\377\377\377\377\377\377\003\000"- 00:07:37.350 [2024-11-28 16:31:34.788682] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273408) > buf size (4096) 00:07:37.350 [2024-11-28 16:31:34.788885] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (148532) > buf size (4096) 00:07:37.350 [2024-11-28 16:31:34.789210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.789239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.350 [2024-11-28 16:31:34.789352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:910c0076 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.789370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.350 #36 NEW cov: 12492 ft: 15432 corp: 24/284b lim: 30 exec/s: 36 rss: 73Mb L: 12/27 MS: 1 ChangeByte- 00:07:37.350 [2024-11-28 16:31:34.838933] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.350 [2024-11-28 16:31:34.839115] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200001637 00:07:37.350 [2024-11-28 16:31:34.839621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.839650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.350 [2024-11-28 16:31:34.839770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0300020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.839789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.350 [2024-11-28 16:31:34.839906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:02000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.839925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.350 #37 NEW cov: 12492 ft: 15459 corp: 25/304b lim: 30 exec/s: 37 rss: 73Mb L: 20/27 MS: 1 PersAutoDict- DE: ".\0267\002\000\000\000\000"- 00:07:37.350 [2024-11-28 16:31:34.908933] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273408) > buf size (4096) 00:07:37.350 [2024-11-28 16:31:34.909281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aff8192 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.909309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.350 #38 NEW cov: 12492 ft: 15474 corp: 26/312b lim: 30 exec/s: 38 rss: 74Mb L: 8/27 MS: 1 ChangeBit- 00:07:37.350 [2024-11-28 16:31:34.979412] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:37.350 [2024-11-28 16:31:34.979756] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:37.350 [2024-11-28 16:31:34.979915] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (149556) > buf size (4096) 00:07:37.350 [2024-11-28 16:31:34.980268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.980297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.350 [2024-11-28 16:31:34.980415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.980435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.350 [2024-11-28 16:31:34.980558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.980581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.350 [2024-11-28 16:31:34.980702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:920c0076 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.350 [2024-11-28 16:31:34.980721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:37.608 #39 NEW cov: 12499 ft: 15503 corp: 27/336b lim: 30 exec/s: 39 rss: 74Mb L: 24/27 MS: 1 ChangeBinInt- 00:07:37.608 [2024-11-28 16:31:35.049504] ctrlr.c:2667:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (571484) > buf size (4096) 00:07:37.608 [2024-11-28 16:31:35.049674] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:37.608 [2024-11-28 16:31:35.050003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2e160237 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.608 [2024-11-28 16:31:35.050034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.608 [2024-11-28 16:31:35.050155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.608 [2024-11-28 16:31:35.050173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.608 #40 NEW cov: 12499 ft: 15513 corp: 28/353b lim: 30 exec/s: 40 rss: 74Mb L: 17/27 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:37.608 [2024-11-28 16:31:35.099540] ctrlr.c:2655:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x9491 00:07:37.608 [2024-11-28 16:31:35.100072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 
cdw10:0aff0045 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:37.608 [2024-11-28 16:31:35.100100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.608 #41 NEW cov: 12499 ft: 15596 corp: 29/363b lim: 30 exec/s: 20 rss: 74Mb L: 10/27 MS: 1 CrossOver- 00:07:37.608 #41 DONE cov: 12499 ft: 15596 corp: 29/363b lim: 30 exec/s: 20 rss: 74Mb 00:07:37.608 ###### Recommended dictionary. ###### 00:07:37.608 "\377\222E(\224P\014v" # Uses: 0 00:07:37.608 ".\0267\002\000\000\000\000" # Uses: 2 00:07:37.608 "\377\377\377\377\377\377\003\000" # Uses: 0 00:07:37.608 "\377\377\377\377\377\377\377\377" # Uses: 0 00:07:37.608 ###### End of recommended dictionary. ###### 00:07:37.608 Done 41 runs in 2 second(s) 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:37.608 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:37.865 16:31:35 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:37.865 [2024-11-28 16:31:35.281582] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
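The "Recommended dictionary" block printed above is standard libFuzzer output: input tokens the fuzzer found productive during the run, emitted in libFuzzer's dictionary syntax (one C-escaped, quoted token per line; the trailing comment is a usage count). A minimal sketch of reusing it, assuming the llvm_nvme_fuzz harness forwards unrecognized flags such as -dict= to the underlying libFuzzer driver — the harness is libFuzzer-based, but this log does not confirm the pass-through, and the file name and shell variables below are hypothetical:

    # nvmf_2.dict -- hypothetical file holding the tokens copied verbatim from the log above
    "\377\222E(\224P\014v"
    ".\0267\002\000\000\000\000"
    "\377\377\377\377\377\377\003\000"
    "\377\377\377\377\377\377\377\377"

    # hypothetical re-invocation mirroring the command traced above, with the saved
    # dictionary added to seed the next run's mutations
    llvm_nvme_fuzz -m 0x1 -s 512 -F "$trid" -c /tmp/fuzz_json_2.conf -t 1 -D "$corpus_dir" -Z 2 -dict=nvmf_2.dict

Since the next run starts from an empty corpus (see the "A corpus is not provided" banner below), a dictionary like this is often the quickest way past the "no interesting inputs were found" warning on the first iterations.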
00:07:37.865 [2024-11-28 16:31:35.281657] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3755166 ] 00:07:38.124 [2024-11-28 16:31:35.531827] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:38.124 [2024-11-28 16:31:35.561379] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.124 [2024-11-28 16:31:35.613625] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:38.124 [2024-11-28 16:31:35.629973] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:38.124 INFO: Running with entropic power schedule (0xFF, 100). 00:07:38.124 INFO: Seed: 2887792877 00:07:38.124 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:38.124 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:38.124 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:38.124 INFO: A corpus is not provided, starting from an empty corpus 00:07:38.124 #2 INITED exec/s: 0 rss: 64Mb 00:07:38.124 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:38.124 This may also happen if the target rejected all inputs we tried so far 00:07:38.124 [2024-11-28 16:31:35.675256] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.124 [2024-11-28 16:31:35.675384] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.124 [2024-11-28 16:31:35.675498] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.124 [2024-11-28 16:31:35.675614] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.124 [2024-11-28 16:31:35.675837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.124 [2024-11-28 16:31:35.675870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.124 [2024-11-28 16:31:35.675930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.124 [2024-11-28 16:31:35.675947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.124 [2024-11-28 16:31:35.676004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.124 [2024-11-28 16:31:35.676036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.124 [2024-11-28 16:31:35.676092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.124 [2024-11-28 16:31:35.676108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.383 NEW_FUNC[1/714]: 0x455b38 in 
fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:38.383 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.383 #3 NEW cov: 12187 ft: 12182 corp: 2/31b lim: 35 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:38.383 [2024-11-28 16:31:35.986118] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.383 [2024-11-28 16:31:35.986270] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.383 [2024-11-28 16:31:35.986382] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.383 [2024-11-28 16:31:35.986509] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.383 [2024-11-28 16:31:35.986783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.383 [2024-11-28 16:31:35.986817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.383 [2024-11-28 16:31:35.986877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.383 [2024-11-28 16:31:35.986892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.383 [2024-11-28 16:31:35.986949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.383 [2024-11-28 16:31:35.986965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.383 [2024-11-28 16:31:35.987020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.383 [2024-11-28 16:31:35.987035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.383 #4 NEW cov: 12301 ft: 12655 corp: 3/61b lim: 35 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 ChangeBit- 00:07:38.653 [2024-11-28 16:31:36.046160] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.653 [2024-11-28 16:31:36.046285] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.653 [2024-11-28 16:31:36.046398] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.653 [2024-11-28 16:31:36.046510] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.653 [2024-11-28 16:31:36.046751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.653 [2024-11-28 16:31:36.046780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.653 [2024-11-28 16:31:36.046835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.653 [2024-11-28 16:31:36.046850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.653 [2024-11-28 16:31:36.046905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.653 [2024-11-28 16:31:36.046920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.653 [2024-11-28 16:31:36.046975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.653 [2024-11-28 16:31:36.046994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.653 #5 NEW cov: 12307 ft: 13069 corp: 4/89b lim: 35 exec/s: 0 rss: 72Mb L: 28/30 MS: 1 EraseBytes- 00:07:38.653 [2024-11-28 16:31:36.086227] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.653 [2024-11-28 16:31:36.086351] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.653 [2024-11-28 16:31:36.086463] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.653 [2024-11-28 16:31:36.086577] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.653 [2024-11-28 16:31:36.086807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.653 [2024-11-28 16:31:36.086838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.653 [2024-11-28 16:31:36.086894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.653 [2024-11-28 16:31:36.086910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.086966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.086980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.087034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.087050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.654 #6 NEW cov: 12392 ft: 13358 corp: 5/117b lim: 35 exec/s: 0 rss: 72Mb L: 28/30 MS: 1 CopyPart- 00:07:38.654 [2024-11-28 16:31:36.146441] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.146564] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 
16:31:36.146685] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.146800] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.147011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.147038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.147094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.147111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.147167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.147182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.147237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.147254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.654 #7 NEW cov: 12392 ft: 13466 corp: 6/147b lim: 35 exec/s: 0 rss: 72Mb L: 30/30 MS: 1 ChangeBinInt- 00:07:38.654 [2024-11-28 16:31:36.186514] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.186673] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.186790] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.186903] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.187116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.187143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.187199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.187215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.187268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.187284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 
16:31:36.187340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.187357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.654 #8 NEW cov: 12392 ft: 13565 corp: 7/178b lim: 35 exec/s: 0 rss: 72Mb L: 31/31 MS: 1 CopyPart- 00:07:38.654 [2024-11-28 16:31:36.246677] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.246803] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.246933] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.247049] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.654 [2024-11-28 16:31:36.247273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.247302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.247360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.247376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.247434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.247450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.654 [2024-11-28 16:31:36.247506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.654 [2024-11-28 16:31:36.247523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.654 #9 NEW cov: 12392 ft: 13625 corp: 8/208b lim: 35 exec/s: 0 rss: 72Mb L: 30/31 MS: 1 ChangeBinInt- 00:07:38.918 [2024-11-28 16:31:36.306860] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.918 [2024-11-28 16:31:36.306991] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.918 [2024-11-28 16:31:36.307107] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.918 [2024-11-28 16:31:36.307220] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.918 [2024-11-28 16:31:36.307434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.918 [2024-11-28 16:31:36.307461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.918 [2024-11-28 16:31:36.307532] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.918 [2024-11-28 16:31:36.307548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.918 [2024-11-28 16:31:36.307604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.918 [2024-11-28 16:31:36.307620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.918 [2024-11-28 16:31:36.307675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.918 [2024-11-28 16:31:36.307691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.918 #10 NEW cov: 12392 ft: 13658 corp: 9/238b lim: 35 exec/s: 0 rss: 72Mb L: 30/31 MS: 1 ShuffleBytes- 00:07:38.918 [2024-11-28 16:31:36.346966] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.918 [2024-11-28 16:31:36.347093] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.918 [2024-11-28 16:31:36.347207] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.918 [2024-11-28 16:31:36.347319] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.918 [2024-11-28 16:31:36.347542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.918 [2024-11-28 16:31:36.347570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.918 [2024-11-28 16:31:36.347626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.918 [2024-11-28 16:31:36.347643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.347698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.347713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.347769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.347784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.919 #11 NEW cov: 12392 ft: 13692 corp: 10/266b lim: 35 exec/s: 0 rss: 72Mb L: 28/31 MS: 1 CMP- DE: "\007\000\000\000"- 00:07:38.919 [2024-11-28 16:31:36.387091] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.387215] 
ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.387339] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.387450] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.387677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.387706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.387760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.387776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.387831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.387846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.387901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.387916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.919 #12 NEW cov: 12392 ft: 13740 corp: 11/296b lim: 35 exec/s: 0 rss: 72Mb L: 30/31 MS: 1 ChangeBit- 00:07:38.919 [2024-11-28 16:31:36.447194] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.447316] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.447432] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.447543] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.447782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.447809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.447866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.447882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.447938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.447953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.448007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.448021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.919 #13 NEW cov: 12392 ft: 13769 corp: 12/324b lim: 35 exec/s: 0 rss: 72Mb L: 28/31 MS: 1 EraseBytes- 00:07:38.919 [2024-11-28 16:31:36.507400] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.507525] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.507664] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:38.919 [2024-11-28 16:31:36.508035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.508064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.508121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.508137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.919 [2024-11-28 16:31:36.508192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:38.919 [2024-11-28 16:31:36.508207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.919 #14 NEW cov: 12392 ft: 14312 corp: 13/358b lim: 35 exec/s: 0 rss: 73Mb L: 34/34 MS: 1 CrossOver- 00:07:39.179 [2024-11-28 16:31:36.567571] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.179 [2024-11-28 16:31:36.567702] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.179 [2024-11-28 16:31:36.567819] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.179 [2024-11-28 16:31:36.567936] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.179 [2024-11-28 16:31:36.568157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.179 [2024-11-28 16:31:36.568184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.179 [2024-11-28 16:31:36.568243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:f7000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.179 [2024-11-28 16:31:36.568260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.179 [2024-11-28 16:31:36.568316] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.179 [2024-11-28 16:31:36.568332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.179 [2024-11-28 16:31:36.568391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.179 [2024-11-28 16:31:36.568407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.179 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:39.179 #15 NEW cov: 12415 ft: 14365 corp: 14/388b lim: 35 exec/s: 0 rss: 73Mb L: 30/34 MS: 1 ChangeBinInt- 00:07:39.179 [2024-11-28 16:31:36.607722] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.179 [2024-11-28 16:31:36.607864] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.179 [2024-11-28 16:31:36.607979] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.179 [2024-11-28 16:31:36.608094] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.179 [2024-11-28 16:31:36.608422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.179 [2024-11-28 16:31:36.608450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.179 [2024-11-28 16:31:36.608512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.179 [2024-11-28 16:31:36.608529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.179 [2024-11-28 16:31:36.608584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.179 [2024-11-28 16:31:36.608604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.179 [2024-11-28 16:31:36.608660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:8b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.179 [2024-11-28 16:31:36.608674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.179 [2024-11-28 16:31:36.608729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:8b00008b cdw11:00000040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.179 [2024-11-28 16:31:36.608743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.179 #16 NEW cov: 12425 ft: 14462 corp: 15/423b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:39.179 [2024-11-28 16:31:36.647766] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: 
Identify Namespace for invalid NSID 0 00:07:39.179 [2024-11-28 16:31:36.647896] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.648014] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.648129] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.648350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.648378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.648438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.648454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.648512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.648527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.648585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.648605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.180 #17 NEW cov: 12425 ft: 14493 corp: 16/452b lim: 35 exec/s: 17 rss: 73Mb L: 29/35 MS: 1 InsertByte- 00:07:39.180 [2024-11-28 16:31:36.687881] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.688008] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.688122] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.688237] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.688453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.688483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.688541] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.688558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.688615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 
16:31:36.688630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.688688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.688704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.180 #18 NEW cov: 12425 ft: 14509 corp: 17/482b lim: 35 exec/s: 18 rss: 73Mb L: 30/35 MS: 1 ChangeByte- 00:07:39.180 [2024-11-28 16:31:36.728056] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.728180] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.728300] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.728416] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.728650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.728678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.728739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.728757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.728815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.728831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.728890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.728906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.180 #19 NEW cov: 12425 ft: 14562 corp: 18/512b lim: 35 exec/s: 19 rss: 73Mb L: 30/35 MS: 1 ShuffleBytes- 00:07:39.180 [2024-11-28 16:31:36.768126] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.768255] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.768391] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.768508] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.180 [2024-11-28 16:31:36.768745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.768774] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.768836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.768853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.768913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.768928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.180 [2024-11-28 16:31:36.768987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.180 [2024-11-28 16:31:36.769004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.180 #20 NEW cov: 12425 ft: 14606 corp: 19/546b lim: 35 exec/s: 20 rss: 73Mb L: 34/35 MS: 1 PersAutoDict- DE: "\007\000\000\000"- 00:07:39.441 [2024-11-28 16:31:36.828376] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.828521] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.828750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.828777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.828832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.828849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.828902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.828917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.441 #21 NEW cov: 12425 ft: 15033 corp: 20/569b lim: 35 exec/s: 21 rss: 73Mb L: 23/35 MS: 1 EraseBytes- 00:07:39.441 [2024-11-28 16:31:36.888462] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.888586] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.888709] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.888828] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.889186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 
cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.889215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.889275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.889292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.889348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.889364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.889423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:8b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.889440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.889499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:0000008b cdw11:00000040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.889512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.441 #22 NEW cov: 12425 ft: 15059 corp: 21/604b lim: 35 exec/s: 22 rss: 73Mb L: 35/35 MS: 1 CrossOver- 00:07:39.441 [2024-11-28 16:31:36.948647] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.948772] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.948890] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.949003] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.949240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.949268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.949327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.949343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.949402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.949418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 
16:31:36.949473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.949488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.441 #23 NEW cov: 12425 ft: 15067 corp: 22/634b lim: 35 exec/s: 23 rss: 73Mb L: 30/35 MS: 1 CopyPart- 00:07:39.441 [2024-11-28 16:31:36.988746] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.988878] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.988996] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.989111] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:36.989337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.989365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.989424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.989441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.989498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.989516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:36.989575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:36.989590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.441 #24 NEW cov: 12425 ft: 15098 corp: 23/665b lim: 35 exec/s: 24 rss: 73Mb L: 31/35 MS: 1 InsertByte- 00:07:39.441 [2024-11-28 16:31:37.048889] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:37.049020] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:37.049156] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:37.049272] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.441 [2024-11-28 16:31:37.049494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:37.049522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:37.049581] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:37.049602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.441 [2024-11-28 16:31:37.049658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.441 [2024-11-28 16:31:37.049674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.442 [2024-11-28 16:31:37.049732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00cb0000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.442 [2024-11-28 16:31:37.049748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.442 #25 NEW cov: 12425 ft: 15104 corp: 24/695b lim: 35 exec/s: 25 rss: 73Mb L: 30/35 MS: 1 ChangeByte- 00:07:39.701 [2024-11-28 16:31:37.089021] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.701 [2024-11-28 16:31:37.089171] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.701 [2024-11-28 16:31:37.089297] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.701 [2024-11-28 16:31:37.089419] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.701 [2024-11-28 16:31:37.089632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.701 [2024-11-28 16:31:37.089676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.089735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.089750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.089806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.089826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.089885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.089902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.702 #26 NEW cov: 12425 ft: 15110 corp: 25/725b lim: 35 exec/s: 26 rss: 73Mb L: 30/35 MS: 1 ShuffleBytes- 00:07:39.702 [2024-11-28 16:31:37.129112] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.129240] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: 
*ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.129360] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.129477] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.129712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.129740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.129800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.129817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.129874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.129890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.129946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.129963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.702 #27 NEW cov: 12425 ft: 15113 corp: 26/753b lim: 35 exec/s: 27 rss: 73Mb L: 28/35 MS: 1 CopyPart- 00:07:39.702 [2024-11-28 16:31:37.189289] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.189419] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.189542] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.189672] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.190006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.190034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.190091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.190107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.190163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.190178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.190238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:8b000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.190254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.190311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:0000008b cdw11:00000040 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.190325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:39.702 #28 NEW cov: 12425 ft: 15133 corp: 27/788b lim: 35 exec/s: 28 rss: 73Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:39.702 [2024-11-28 16:31:37.249478] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.249614] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.249736] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.249853] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.250081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.250107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.250166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.250182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.250238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.250253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.250311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00cb0000 cdw11:00000006 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.250325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.702 #29 NEW cov: 12425 ft: 15150 corp: 28/818b lim: 35 exec/s: 29 rss: 73Mb L: 30/35 MS: 1 CMP- DE: "\001\000\000\020"- 00:07:39.702 [2024-11-28 16:31:37.309636] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.309767] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.309904] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.702 [2024-11-28 16:31:37.310024] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid 
NSID 0 00:07:39.702 [2024-11-28 16:31:37.310246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.310273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.310332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.310349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.310403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.310422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.702 [2024-11-28 16:31:37.310481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.702 [2024-11-28 16:31:37.310497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.702 #30 NEW cov: 12425 ft: 15172 corp: 29/848b lim: 35 exec/s: 30 rss: 73Mb L: 30/35 MS: 1 CrossOver- 00:07:39.962 [2024-11-28 16:31:37.349710] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.962 [2024-11-28 16:31:37.349834] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.962 [2024-11-28 16:31:37.349948] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.962 [2024-11-28 16:31:37.350059] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.962 [2024-11-28 16:31:37.350288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.962 [2024-11-28 16:31:37.350314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.962 [2024-11-28 16:31:37.350371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.962 [2024-11-28 16:31:37.350387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.962 [2024-11-28 16:31:37.350444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.962 [2024-11-28 16:31:37.350457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.962 [2024-11-28 16:31:37.350513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.962 [2024-11-28 16:31:37.350529] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.962 #31 NEW cov: 12425 ft: 15189 corp: 30/878b lim: 35 exec/s: 31 rss: 73Mb L: 30/35 MS: 1 ShuffleBytes- 00:07:39.962 [2024-11-28 16:31:37.389828] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.389951] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.390069] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.390179] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.390397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.390424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.390479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.390496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.390554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:0000eb00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.390570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.390627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.390642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.963 #32 NEW cov: 12425 ft: 15196 corp: 31/909b lim: 35 exec/s: 32 rss: 74Mb L: 31/35 MS: 1 ChangeByte- 00:07:39.963 [2024-11-28 16:31:37.450023] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.450146] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.450259] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.450372] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.450584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.450614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.450672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.450688] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.450744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.450759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.450813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.450828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.963 #33 NEW cov: 12425 ft: 15201 corp: 32/943b lim: 35 exec/s: 33 rss: 74Mb L: 34/35 MS: 1 ChangeASCIIInt- 00:07:39.963 [2024-11-28 16:31:37.490135] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.490263] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.490379] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.490718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.490746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.490805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.490821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.490879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.490895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.490951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000009 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.490968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.963 #34 NEW cov: 12425 ft: 15217 corp: 33/973b lim: 35 exec/s: 34 rss: 74Mb L: 30/35 MS: 1 ChangeBinInt- 00:07:39.963 [2024-11-28 16:31:37.530232] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.530353] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.530465] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.530575] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.530817] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.530844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.530903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.530919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.530977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.530992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.531047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.531063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.963 #35 NEW cov: 12425 ft: 15280 corp: 34/1006b lim: 35 exec/s: 35 rss: 74Mb L: 33/35 MS: 1 PersAutoDict- DE: "\001\000\000\020"- 00:07:39.963 [2024-11-28 16:31:37.590430] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.590550] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.590690] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.590805] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:39.963 [2024-11-28 16:31:37.591024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.591054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.591114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.591130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.591188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.591204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.963 [2024-11-28 16:31:37.591260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:39.963 [2024-11-28 16:31:37.591276] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.224 #36 NEW cov: 12425 ft: 15292 corp: 35/1040b lim: 35 exec/s: 36 rss: 74Mb L: 34/35 MS: 1 PersAutoDict- DE: "\007\000\000\000"- 00:07:40.224 [2024-11-28 16:31:37.630513] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.224 [2024-11-28 16:31:37.630648] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.224 [2024-11-28 16:31:37.630756] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.224 [2024-11-28 16:31:37.630869] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.224 [2024-11-28 16:31:37.631086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.224 [2024-11-28 16:31:37.631112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.224 [2024-11-28 16:31:37.631171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.224 [2024-11-28 16:31:37.631188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.224 [2024-11-28 16:31:37.631243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.224 [2024-11-28 16:31:37.631258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.224 [2024-11-28 16:31:37.631315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.224 [2024-11-28 16:31:37.631331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.224 #37 NEW cov: 12425 ft: 15315 corp: 36/1074b lim: 35 exec/s: 37 rss: 74Mb L: 34/35 MS: 1 ShuffleBytes- 00:07:40.224 [2024-11-28 16:31:37.670627] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.224 [2024-11-28 16:31:37.670753] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.224 [2024-11-28 16:31:37.670869] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.224 [2024-11-28 16:31:37.670984] ctrlr.c:2750:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:40.224 [2024-11-28 16:31:37.671203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.224 [2024-11-28 16:31:37.671229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.224 [2024-11-28 16:31:37.671288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.224 [2024-11-28 16:31:37.671304] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.224 [2024-11-28 16:31:37.671359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ad000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.224 [2024-11-28 16:31:37.671374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.224 [2024-11-28 16:31:37.671430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:40.224 [2024-11-28 16:31:37.671445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.224 #38 NEW cov: 12425 ft: 15340 corp: 37/1105b lim: 35 exec/s: 19 rss: 74Mb L: 31/35 MS: 1 InsertByte- 00:07:40.224 #38 DONE cov: 12425 ft: 15340 corp: 37/1105b lim: 35 exec/s: 19 rss: 74Mb 00:07:40.224 ###### Recommended dictionary. ###### 00:07:40.224 "\007\000\000\000" # Uses: 2 00:07:40.224 "\001\000\000\020" # Uses: 1 00:07:40.224 ###### End of recommended dictionary. ###### 00:07:40.224 Done 38 runs in 2 second(s) 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:40.224 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:40.225 16:31:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:40.484 [2024-11-28 16:31:37.871402] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:40.484 [2024-11-28 16:31:37.871473] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3755650 ] 00:07:40.484 [2024-11-28 16:31:38.124841] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.743 [2024-11-28 16:31:38.155694] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.743 [2024-11-28 16:31:38.207866] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.743 [2024-11-28 16:31:38.224211] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:40.743 INFO: Running with entropic power schedule (0xFF, 100). 00:07:40.743 INFO: Seed: 1185832126 00:07:40.743 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:40.743 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:40.743 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:40.743 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.743 #2 INITED exec/s: 0 rss: 64Mb 00:07:40.743 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:40.743 This may also happen if the target rejected all inputs we tried so far 00:07:40.743 [2024-11-28 16:31:38.283621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:40.743 [2024-11-28 16:31:38.283657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.003 NEW_FUNC[1/723]: 0x457818 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:41.003 NEW_FUNC[2/723]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:41.003 #4 NEW cov: 12420 ft: 12389 corp: 2/13b lim: 20 exec/s: 0 rss: 72Mb L: 12/12 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:41.003 [2024-11-28 16:31:38.614318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.003 [2024-11-28 16:31:38.614355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.003 #6 NEW cov: 12535 ft: 13373 corp: 3/22b lim: 20 exec/s: 0 rss: 72Mb L: 9/12 MS: 2 ChangeBit-CrossOver- 00:07:41.262 [2024-11-28 16:31:38.654559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.262 [2024-11-28 16:31:38.654587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.262 NEW_FUNC[1/1]: 0x12dfb58 in nvmf_ctrlr_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3550 00:07:41.262 #7 NEW cov: 12558 ft: 13678 corp: 4/34b lim: 20 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 ChangeBit- 00:07:41.262 [2024-11-28 16:31:38.714631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.262 [2024-11-28 16:31:38.714658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.262 #8 NEW cov: 12649 ft: 13950 corp: 5/46b lim: 20 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 ChangeByte- 00:07:41.262 #11 NEW cov: 12649 ft: 14099 corp: 6/56b lim: 20 exec/s: 0 rss: 72Mb L: 10/12 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:41.262 [2024-11-28 16:31:38.794787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.262 [2024-11-28 16:31:38.794814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.262 #12 NEW cov: 12649 ft: 14216 corp: 7/65b lim: 20 exec/s: 0 rss: 72Mb L: 9/12 MS: 1 ChangeByte- 00:07:41.262 [2024-11-28 16:31:38.855011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.262 [2024-11-28 16:31:38.855038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.262 #13 NEW cov: 12649 ft: 14322 corp: 8/77b lim: 20 exec/s: 0 rss: 72Mb L: 12/12 MS: 1 ChangeByte- 00:07:41.546 [2024-11-28 16:31:38.915427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT 
REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.546 [2024-11-28 16:31:38.915454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.546 #14 NEW cov: 12666 ft: 14535 corp: 9/94b lim: 20 exec/s: 0 rss: 72Mb L: 17/17 MS: 1 CMP- DE: "\323\311nj+E\223\000"- 00:07:41.546 [2024-11-28 16:31:38.955428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.546 [2024-11-28 16:31:38.955454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.546 #15 NEW cov: 12666 ft: 14623 corp: 10/111b lim: 20 exec/s: 0 rss: 73Mb L: 17/17 MS: 1 CopyPart- 00:07:41.546 [2024-11-28 16:31:39.015456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.546 [2024-11-28 16:31:39.015482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.546 #16 NEW cov: 12666 ft: 14711 corp: 11/123b lim: 20 exec/s: 0 rss: 73Mb L: 12/17 MS: 1 ShuffleBytes- 00:07:41.546 [2024-11-28 16:31:39.075690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.546 [2024-11-28 16:31:39.075716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.546 #17 NEW cov: 12666 ft: 14745 corp: 12/140b lim: 20 exec/s: 0 rss: 73Mb L: 17/17 MS: 1 ShuffleBytes- 00:07:41.546 [2024-11-28 16:31:39.115755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.546 [2024-11-28 16:31:39.115780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.546 #18 NEW cov: 12666 ft: 14829 corp: 13/158b lim: 20 exec/s: 0 rss: 73Mb L: 18/18 MS: 1 CopyPart- 00:07:41.546 [2024-11-28 16:31:39.175905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.546 [2024-11-28 16:31:39.175931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.806 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:41.806 #19 NEW cov: 12689 ft: 14864 corp: 14/171b lim: 20 exec/s: 0 rss: 73Mb L: 13/18 MS: 1 InsertByte- 00:07:41.806 #20 NEW cov: 12689 ft: 14882 corp: 15/181b lim: 20 exec/s: 0 rss: 73Mb L: 10/18 MS: 1 ChangeByte- 00:07:41.806 [2024-11-28 16:31:39.276260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.806 [2024-11-28 16:31:39.276285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.806 #21 NEW cov: 12689 ft: 14936 corp: 16/193b lim: 20 exec/s: 21 rss: 73Mb L: 12/18 MS: 1 ChangeBit- 00:07:41.806 #22 NEW cov: 12689 ft: 15013 corp: 17/203b lim: 20 exec/s: 22 rss: 73Mb L: 10/18 MS: 1 InsertByte- 00:07:41.806 [2024-11-28 16:31:39.376512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.806 [2024-11-28 16:31:39.376539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.806 #23 NEW cov: 12689 ft: 15033 corp: 18/215b lim: 20 exec/s: 23 rss: 73Mb L: 12/18 MS: 1 ShuffleBytes- 00:07:42.066 #24 NEW cov: 12689 ft: 15056 corp: 19/233b lim: 20 exec/s: 24 rss: 73Mb L: 18/18 MS: 1 PersAutoDict- DE: "\323\311nj+E\223\000"- 00:07:42.066 [2024-11-28 16:31:39.476945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.066 [2024-11-28 16:31:39.476973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.066 [2024-11-28 16:31:39.477093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:2 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.066 [2024-11-28 16:31:39.477110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:2 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.066 #25 NEW cov: 12691 ft: 15701 corp: 20/253b lim: 20 exec/s: 25 rss: 73Mb L: 20/20 MS: 1 CopyPart- 00:07:42.066 [2024-11-28 16:31:39.516826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.066 [2024-11-28 16:31:39.516853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.066 #26 NEW cov: 12691 ft: 15713 corp: 21/265b lim: 20 exec/s: 26 rss: 73Mb L: 12/20 MS: 1 ShuffleBytes- 00:07:42.066 [2024-11-28 16:31:39.557014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.066 [2024-11-28 16:31:39.557044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.066 #27 NEW cov: 12691 ft: 15733 corp: 22/283b lim: 20 exec/s: 27 rss: 73Mb L: 18/20 MS: 1 ChangeByte- 00:07:42.066 [2024-11-28 16:31:39.617190] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.066 [2024-11-28 16:31:39.617216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.066 #28 NEW cov: 12691 ft: 15748 corp: 23/295b lim: 20 exec/s: 28 rss: 73Mb L: 12/20 MS: 1 ShuffleBytes- 00:07:42.326 #29 NEW cov: 12691 ft: 15762 corp: 24/313b lim: 20 exec/s: 29 rss: 73Mb L: 18/20 MS: 1 ChangeByte- 00:07:42.326 [2024-11-28 16:31:39.737548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.326 [2024-11-28 16:31:39.737574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.326 #30 NEW cov: 12691 ft: 15764 corp: 25/330b lim: 20 exec/s: 30 rss: 73Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:42.326 #31 NEW cov: 12691 ft: 15791 corp: 26/343b lim: 20 exec/s: 31 rss: 74Mb L: 13/20 MS: 1 InsertByte- 00:07:42.326 [2024-11-28 16:31:39.837769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 
00:07:42.326 [2024-11-28 16:31:39.837796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.326 #32 NEW cov: 12691 ft: 15809 corp: 27/356b lim: 20 exec/s: 32 rss: 74Mb L: 13/20 MS: 1 InsertByte- 00:07:42.326 [2024-11-28 16:31:39.877763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.326 [2024-11-28 16:31:39.877790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.326 #33 NEW cov: 12691 ft: 15866 corp: 28/366b lim: 20 exec/s: 33 rss: 74Mb L: 10/20 MS: 1 CrossOver- 00:07:42.326 [2024-11-28 16:31:39.918010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.326 [2024-11-28 16:31:39.918035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.326 #34 NEW cov: 12691 ft: 15875 corp: 29/378b lim: 20 exec/s: 34 rss: 74Mb L: 12/20 MS: 1 ChangeByte- 00:07:42.586 #37 NEW cov: 12691 ft: 16133 corp: 30/382b lim: 20 exec/s: 37 rss: 74Mb L: 4/20 MS: 3 CrossOver-ChangeByte-InsertByte- 00:07:42.586 #38 NEW cov: 12691 ft: 16145 corp: 31/392b lim: 20 exec/s: 38 rss: 74Mb L: 10/20 MS: 1 CMP- DE: "\001\000\000\000\000\000\004\000"- 00:07:42.586 [2024-11-28 16:31:40.058473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.586 [2024-11-28 16:31:40.058510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.586 #39 NEW cov: 12691 ft: 16153 corp: 32/409b lim: 20 exec/s: 39 rss: 74Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:42.586 #40 NEW cov: 12691 ft: 16160 corp: 33/427b lim: 20 exec/s: 40 rss: 74Mb L: 18/20 MS: 1 InsertByte- 00:07:42.586 [2024-11-28 16:31:40.178724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.586 [2024-11-28 16:31:40.178760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.586 #41 NEW cov: 12691 ft: 16168 corp: 34/439b lim: 20 exec/s: 41 rss: 74Mb L: 12/20 MS: 1 ChangeByte- 00:07:42.845 #42 NEW cov: 12691 ft: 16205 corp: 35/449b lim: 20 exec/s: 42 rss: 74Mb L: 10/20 MS: 1 ChangeByte- 00:07:42.845 [2024-11-28 16:31:40.279033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:42.845 [2024-11-28 16:31:40.279066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.845 #43 NEW cov: 12691 ft: 16208 corp: 36/461b lim: 20 exec/s: 21 rss: 74Mb L: 12/20 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\004\000"- 00:07:42.845 #43 DONE cov: 12691 ft: 16208 corp: 36/461b lim: 20 exec/s: 21 rss: 74Mb 00:07:42.845 ###### Recommended dictionary. ###### 00:07:42.845 "\323\311nj+E\223\000" # Uses: 1 00:07:42.845 "\001\000\000\000\000\000\004\000" # Uses: 1 00:07:42.845 ###### End of recommended dictionary. 
###### 00:07:42.845 Done 43 runs in 2 second(s) 00:07:42.845 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:07:42.845 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:42.845 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:42.846 16:31:40 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:07:42.846 [2024-11-28 16:31:40.459961] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:42.846 [2024-11-28 16:31:40.460051] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3756179 ] 00:07:43.105 [2024-11-28 16:31:40.710339] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.105 [2024-11-28 16:31:40.741144] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.364 [2024-11-28 16:31:40.793471] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:43.364 [2024-11-28 16:31:40.809822] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:43.364 INFO: Running with entropic power schedule (0xFF, 100). 
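The xtrace lines above show how nvmf/run.sh stands up each fuzzer instance. Condensed into a sketch, with $rootdir and $i standing in for the SPDK checkout path and the fuzzer index, and with the redirection of the sed output to the per-run config assumed (xtrace does not print redirections):

    i=4
    port=44$(printf %02d $i)                  # 4404 for fuzzer 4, 4405 for fuzzer 5, ...
    mkdir -p $rootdir/../corpus/llvm_nvmf_$i  # per-fuzzer corpus directory
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    # point the JSON target config at the derived port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > /tmp/fuzz_json_$i.conf
    # suppress known shutdown-path leaks so LSAN does not fail the run
    echo leak:spdk_nvmf_qpair_disconnect > /var/tmp/suppress_nvmf_fuzz
    echo leak:nvmf_ctrlr_create >> /var/tmp/suppress_nvmf_fuzz
    export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0
    # one core (-m 0x1), 512 MB of memory (-s 512), one second per run (-t 1);
    # -Z $i selects the fuzzer, -F the target transport ID, -D the corpus
    $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
        -P $rootdir/../output/llvm/ -F "$trid" -c /tmp/fuzz_json_$i.conf \
        -t 1 -D $rootdir/../corpus/llvm_nvmf_$i -Z $i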
00:07:43.364 INFO: Seed: 3771823070 00:07:43.365 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:43.365 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:43.365 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:43.365 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.365 #2 INITED exec/s: 0 rss: 64Mb 00:07:43.365 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:43.365 This may also happen if the target rejected all inputs we tried so far 00:07:43.365 [2024-11-28 16:31:40.855422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.365 [2024-11-28 16:31:40.855449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.365 [2024-11-28 16:31:40.855502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.365 [2024-11-28 16:31:40.855516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.365 [2024-11-28 16:31:40.855566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.365 [2024-11-28 16:31:40.855580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.624 NEW_FUNC[1/715]: 0x458918 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:43.624 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:43.624 #4 NEW cov: 12208 ft: 12202 corp: 2/26b lim: 35 exec/s: 0 rss: 71Mb L: 25/25 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:07:43.624 [2024-11-28 16:31:41.186291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.624 [2024-11-28 16:31:41.186321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.624 [2024-11-28 16:31:41.186375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.624 [2024-11-28 16:31:41.186389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.624 [2024-11-28 16:31:41.186441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.624 [2024-11-28 16:31:41.186455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.624 #5 NEW cov: 12321 ft: 12675 corp: 3/51b lim: 35 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 ShuffleBytes- 00:07:43.624 [2024-11-28 16:31:41.246406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ 
(05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.624 [2024-11-28 16:31:41.246431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.624 [2024-11-28 16:31:41.246499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.624 [2024-11-28 16:31:41.246513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.624 [2024-11-28 16:31:41.246565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.624 [2024-11-28 16:31:41.246578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.884 #11 NEW cov: 12327 ft: 13007 corp: 4/76b lim: 35 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 CrossOver- 00:07:43.884 [2024-11-28 16:31:41.306501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.306531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.306602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.306617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.306669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.306682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.884 #12 NEW cov: 12412 ft: 13292 corp: 5/101b lim: 35 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 ChangeBit- 00:07:43.884 [2024-11-28 16:31:41.346307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.346333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.884 #14 NEW cov: 12412 ft: 14124 corp: 6/108b lim: 35 exec/s: 0 rss: 72Mb L: 7/25 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:43.884 [2024-11-28 16:31:41.386768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.386794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.386863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.386877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.386929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.386944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.884 #15 NEW cov: 12412 ft: 14207 corp: 7/133b lim: 35 exec/s: 0 rss: 72Mb L: 25/25 MS: 1 ChangeBit- 00:07:43.884 [2024-11-28 16:31:41.427188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.427213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.427281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.427296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.427351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.427364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.427417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.427430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.427482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.427499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.884 #16 NEW cov: 12412 ft: 14607 corp: 8/168b lim: 35 exec/s: 0 rss: 72Mb L: 35/35 MS: 1 CopyPart- 00:07:43.884 [2024-11-28 16:31:41.466964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.466990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.467041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.467055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.467106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.467119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:43.884 #17 NEW cov: 12412 ft: 14680 corp: 9/193b lim: 35 exec/s: 0 rss: 72Mb L: 25/35 MS: 1 CrossOver- 00:07:43.884 [2024-11-28 16:31:41.527171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.527196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.527248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.527262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.884 [2024-11-28 16:31:41.527316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:01000000 cdw11:00060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:43.884 [2024-11-28 16:31:41.527329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.144 #18 NEW cov: 12412 ft: 14701 corp: 10/218b lim: 35 exec/s: 0 rss: 72Mb L: 25/35 MS: 1 ChangeBinInt- 00:07:44.144 [2024-11-28 16:31:41.587271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.587296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.587348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000de00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.587362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.587414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.587427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.144 #19 NEW cov: 12412 ft: 14734 corp: 11/244b lim: 35 exec/s: 0 rss: 72Mb L: 26/35 MS: 1 InsertByte- 00:07:44.144 [2024-11-28 16:31:41.627420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.627445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.627498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.627515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.627565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.627579] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.144 #20 NEW cov: 12412 ft: 14753 corp: 12/270b lim: 35 exec/s: 0 rss: 72Mb L: 26/35 MS: 1 InsertByte- 00:07:44.144 [2024-11-28 16:31:41.667530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:10000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.667556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.667611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.667625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.667675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.667689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.144 #21 NEW cov: 12412 ft: 14782 corp: 13/295b lim: 35 exec/s: 0 rss: 72Mb L: 25/35 MS: 1 ChangeBit- 00:07:44.144 [2024-11-28 16:31:41.727679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.727704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.727773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.727788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.727841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2b0cafc8 cdw11:2d450001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.727855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.144 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:44.144 #22 NEW cov: 12435 ft: 14848 corp: 14/320b lim: 35 exec/s: 0 rss: 72Mb L: 25/35 MS: 1 CMP- DE: "\257\310+\014-E\223\000"- 00:07:44.144 [2024-11-28 16:31:41.767776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.767801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.767851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.767864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:07:44.144 [2024-11-28 16:31:41.767913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.144 [2024-11-28 16:31:41.767927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.144 #23 NEW cov: 12435 ft: 14864 corp: 15/341b lim: 35 exec/s: 0 rss: 72Mb L: 21/35 MS: 1 CrossOver- 00:07:44.403 [2024-11-28 16:31:41.807970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:41.807994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.403 [2024-11-28 16:31:41.808064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:41.808078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.403 [2024-11-28 16:31:41.808131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2b0cafc8 cdw11:2c450001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:41.808144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.403 #24 NEW cov: 12435 ft: 14897 corp: 16/366b lim: 35 exec/s: 24 rss: 73Mb L: 25/35 MS: 1 ChangeBit- 00:07:44.403 [2024-11-28 16:31:41.868103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:41.868127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.403 [2024-11-28 16:31:41.868196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:41.868210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.403 [2024-11-28 16:31:41.868261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:41.868274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.403 #25 NEW cov: 12435 ft: 14901 corp: 17/391b lim: 35 exec/s: 25 rss: 73Mb L: 25/35 MS: 1 CMP- DE: "\012\000\000\000\000\000\000\000"- 00:07:44.403 [2024-11-28 16:31:41.908259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:000aff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:41.908283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.403 [2024-11-28 16:31:41.908337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:44.403 [2024-11-28 16:31:41.908350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.403 [2024-11-28 16:31:41.908404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:01000000 cdw11:00060000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:41.908417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.403 #26 NEW cov: 12435 ft: 14957 corp: 18/416b lim: 35 exec/s: 26 rss: 73Mb L: 25/35 MS: 1 PersAutoDict- DE: "\012\000\000\000\000\000\000\000"- 00:07:44.403 [2024-11-28 16:31:41.968098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:41.968123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.403 #27 NEW cov: 12435 ft: 14961 corp: 19/423b lim: 35 exec/s: 27 rss: 73Mb L: 7/35 MS: 1 ChangeBit- 00:07:44.403 [2024-11-28 16:31:42.028302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.403 [2024-11-28 16:31:42.028329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.663 #31 NEW cov: 12435 ft: 14980 corp: 20/433b lim: 35 exec/s: 31 rss: 73Mb L: 10/35 MS: 4 ChangeByte-ShuffleBytes-InsertRepeatedBytes-InsertRepeatedBytes- 00:07:44.663 [2024-11-28 16:31:42.068509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.663 [2024-11-28 16:31:42.068534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.663 [2024-11-28 16:31:42.068607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.663 [2024-11-28 16:31:42.068622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.663 #32 NEW cov: 12435 ft: 15237 corp: 21/450b lim: 35 exec/s: 32 rss: 73Mb L: 17/35 MS: 1 EraseBytes- 00:07:44.663 [2024-11-28 16:31:42.108975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.663 [2024-11-28 16:31:42.108999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.663 [2024-11-28 16:31:42.109069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.663 [2024-11-28 16:31:42.109083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.663 [2024-11-28 16:31:42.109134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00ff0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.663 [2024-11-28 16:31:42.109147] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.663 [2024-11-28 16:31:42.109198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.663 [2024-11-28 16:31:42.109211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.663 #33 NEW cov: 12435 ft: 15275 corp: 22/479b lim: 35 exec/s: 33 rss: 73Mb L: 29/35 MS: 1 InsertRepeatedBytes- 00:07:44.663 [2024-11-28 16:31:42.169147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.663 [2024-11-28 16:31:42.169171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.663 [2024-11-28 16:31:42.169241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.663 [2024-11-28 16:31:42.169255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.664 [2024-11-28 16:31:42.169308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.664 [2024-11-28 16:31:42.169321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.664 [2024-11-28 16:31:42.169373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.664 [2024-11-28 16:31:42.169387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.664 #34 NEW cov: 12435 ft: 15288 corp: 23/508b lim: 35 exec/s: 34 rss: 73Mb L: 29/35 MS: 1 CopyPart- 00:07:44.664 [2024-11-28 16:31:42.229143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.664 [2024-11-28 16:31:42.229167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.664 [2024-11-28 16:31:42.229236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.664 [2024-11-28 16:31:42.229250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.664 [2024-11-28 16:31:42.229301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2b0cafc8 cdw11:2c450001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.664 [2024-11-28 16:31:42.229314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.664 #35 NEW cov: 12435 ft: 15305 corp: 24/533b lim: 35 exec/s: 35 rss: 73Mb L: 25/35 MS: 1 ChangeBit- 00:07:44.664 [2024-11-28 16:31:42.289334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 
cid:4 nsid:0 cdw10:0000bf00 cdw11:10000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.664 [2024-11-28 16:31:42.289358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.664 [2024-11-28 16:31:42.289426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.664 [2024-11-28 16:31:42.289440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.664 [2024-11-28 16:31:42.289492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.664 [2024-11-28 16:31:42.289506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.923 #36 NEW cov: 12435 ft: 15323 corp: 25/558b lim: 35 exec/s: 36 rss: 73Mb L: 25/35 MS: 1 ShuffleBytes- 00:07:44.923 [2024-11-28 16:31:42.349482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.923 [2024-11-28 16:31:42.349506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.923 [2024-11-28 16:31:42.349575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.923 [2024-11-28 16:31:42.349589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.923 [2024-11-28 16:31:42.349642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.923 [2024-11-28 16:31:42.349656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.923 #37 NEW cov: 12435 ft: 15341 corp: 26/583b lim: 35 exec/s: 37 rss: 73Mb L: 25/35 MS: 1 ChangeBit- 00:07:44.923 [2024-11-28 16:31:42.389242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.923 [2024-11-28 16:31:42.389266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.923 #38 NEW cov: 12435 ft: 15404 corp: 27/596b lim: 35 exec/s: 38 rss: 73Mb L: 13/35 MS: 1 EraseBytes- 00:07:44.923 [2024-11-28 16:31:42.449758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.923 [2024-11-28 16:31:42.449785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.923 [2024-11-28 16:31:42.449854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00190000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.923 [2024-11-28 16:31:42.449868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.923 
[2024-11-28 16:31:42.449921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.923 [2024-11-28 16:31:42.449934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.923 #39 NEW cov: 12435 ft: 15406 corp: 28/621b lim: 35 exec/s: 39 rss: 73Mb L: 25/35 MS: 1 ChangeBinInt- 00:07:44.923 [2024-11-28 16:31:42.489494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0d0d2c0d cdw11:0d0d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.923 [2024-11-28 16:31:42.489518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.924 #44 NEW cov: 12435 ft: 15420 corp: 29/632b lim: 35 exec/s: 44 rss: 73Mb L: 11/35 MS: 5 CopyPart-InsertByte-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:44.924 [2024-11-28 16:31:42.530144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:10000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.924 [2024-11-28 16:31:42.530168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.924 [2024-11-28 16:31:42.530237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.924 [2024-11-28 16:31:42.530251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.924 [2024-11-28 16:31:42.530305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff40ffff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.924 [2024-11-28 16:31:42.530317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.924 [2024-11-28 16:31:42.530373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:44.924 [2024-11-28 16:31:42.530386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.924 #45 NEW cov: 12435 ft: 15426 corp: 30/660b lim: 35 exec/s: 45 rss: 73Mb L: 28/35 MS: 1 InsertRepeatedBytes- 00:07:45.183 [2024-11-28 16:31:42.570368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.183 [2024-11-28 16:31:42.570393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.183 [2024-11-28 16:31:42.570446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.183 [2024-11-28 16:31:42.570459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.183 [2024-11-28 16:31:42.570512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:003d0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
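The CREATE IO CQ (05) records above print the fuzzed dwords verbatim. Assuming the field layout from the NVMe base specification (not something this log states), cdw10 packs the queue identifier in bits 15:0 and the queue size in bits 31:16, which makes the logged values easy to pick apart in the shell:

    cdw10=0x0000bf00   # one of the fuzzed values logged above
    printf 'QID=%u QSIZE=%u\n' $((cdw10 & 0xffff)) $((cdw10 >> 16 & 0xffff))
    # QID=48896 QSIZE=0

The INVALID OPCODE (00/01) completions are consistent with a fabrics target, where I/O queues are created via the Connect command rather than opcode 05h.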
00:07:45.183 [2024-11-28 16:31:42.570526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.183 [2024-11-28 16:31:42.570581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2d45e689 cdw11:93000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.183 [2024-11-28 16:31:42.570603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.183 #46 NEW cov: 12435 ft: 15433 corp: 31/693b lim: 35 exec/s: 46 rss: 73Mb L: 33/35 MS: 1 CMP- DE: "=\260\346\211-E\223\000"- 00:07:45.183 [2024-11-28 16:31:42.609911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.183 [2024-11-28 16:31:42.609935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.183 #47 NEW cov: 12435 ft: 15443 corp: 32/706b lim: 35 exec/s: 47 rss: 73Mb L: 13/35 MS: 1 CopyPart- 00:07:45.183 [2024-11-28 16:31:42.650351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000bf00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.183 [2024-11-28 16:31:42.650376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.183 [2024-11-28 16:31:42.650428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.183 [2024-11-28 16:31:42.650443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.183 [2024-11-28 16:31:42.650498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2b0c0000 cdw11:2c450001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.650512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.184 #48 NEW cov: 12435 ft: 15470 corp: 33/731b lim: 35 exec/s: 48 rss: 73Mb L: 25/35 MS: 1 CrossOver- 00:07:45.184 [2024-11-28 16:31:42.690088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.690112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.184 #49 NEW cov: 12435 ft: 15487 corp: 34/744b lim: 35 exec/s: 49 rss: 73Mb L: 13/35 MS: 1 ChangeByte- 00:07:45.184 [2024-11-28 16:31:42.750710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.750734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.184 [2024-11-28 16:31:42.750805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.750820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.184 [2024-11-28 16:31:42.750874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.750887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.184 [2024-11-28 16:31:42.750940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.750953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.184 #50 NEW cov: 12435 ft: 15526 corp: 35/776b lim: 35 exec/s: 50 rss: 74Mb L: 32/35 MS: 1 CrossOver- 00:07:45.184 [2024-11-28 16:31:42.790824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.790853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.184 [2024-11-28 16:31:42.790922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00400000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.790937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.184 [2024-11-28 16:31:42.790992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.791006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.184 [2024-11-28 16:31:42.791057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.184 [2024-11-28 16:31:42.791071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:45.444 #51 NEW cov: 12435 ft: 15572 corp: 36/809b lim: 35 exec/s: 51 rss: 74Mb L: 33/35 MS: 1 CrossOver- 00:07:45.444 [2024-11-28 16:31:42.850542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:bf00ff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.444 [2024-11-28 16:31:42.850567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.444 #52 NEW cov: 12435 ft: 15586 corp: 37/817b lim: 35 exec/s: 26 rss: 74Mb L: 8/35 MS: 1 CrossOver- 00:07:45.444 #52 DONE cov: 12435 ft: 15586 corp: 37/817b lim: 35 exec/s: 26 rss: 74Mb 00:07:45.444 ###### Recommended dictionary. ###### 00:07:45.444 "\257\310+\014-E\223\000" # Uses: 0 00:07:45.444 "\012\000\000\000\000\000\000\000" # Uses: 1 00:07:45.444 "=\260\346\211-E\223\000" # Uses: 0 00:07:45.444 ###### End of recommended dictionary. 
###### 00:07:45.444 Done 52 runs in 2 second(s) 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:07:45.444 16:31:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.444 16:31:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:45.444 16:31:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:45.444 16:31:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:45.444 16:31:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:45.444 16:31:43 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:07:45.444 [2024-11-28 16:31:43.037627] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:45.444 [2024-11-28 16:31:43.037708] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3756513 ] 00:07:45.704 [2024-11-28 16:31:43.286165] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.704 [2024-11-28 16:31:43.316486] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.965 [2024-11-28 16:31:43.368632] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.965 [2024-11-28 16:31:43.384985] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:45.965 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:45.965 INFO: Seed: 2053863325 00:07:45.965 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:45.965 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:45.965 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:45.965 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.965 #2 INITED exec/s: 0 rss: 64Mb 00:07:45.965 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:45.965 This may also happen if the target rejected all inputs we tried so far 00:07:45.965 [2024-11-28 16:31:43.451927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.965 [2024-11-28 16:31:43.451966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.224 NEW_FUNC[1/715]: 0x45aab8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:46.224 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:46.224 #10 NEW cov: 12195 ft: 12172 corp: 2/10b lim: 45 exec/s: 0 rss: 72Mb L: 9/9 MS: 3 CopyPart-ChangeByte-InsertRepeatedBytes- 00:07:46.224 [2024-11-28 16:31:43.782158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.224 [2024-11-28 16:31:43.782195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.224 #14 NEW cov: 12332 ft: 12968 corp: 3/19b lim: 45 exec/s: 0 rss: 72Mb L: 9/9 MS: 4 ShuffleBytes-ChangeBit-ChangeByte-CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:46.224 [2024-11-28 16:31:43.833092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:46464646 cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.224 [2024-11-28 16:31:43.833119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.224 [2024-11-28 16:31:43.833243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:46464646 cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.224 [2024-11-28 16:31:43.833261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.224 [2024-11-28 16:31:43.833382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:46464646 cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.224 [2024-11-28 16:31:43.833399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.224 [2024-11-28 16:31:43.833519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:46464646 cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.224 [2024-11-28 16:31:43.833538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.224 #15 NEW cov: 12338 ft: 13907 
corp: 4/61b lim: 45 exec/s: 0 rss: 72Mb L: 42/42 MS: 1 InsertRepeatedBytes- 00:07:46.484 [2024-11-28 16:31:43.882404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff08 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.484 [2024-11-28 16:31:43.882432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.484 #21 NEW cov: 12423 ft: 14188 corp: 5/70b lim: 45 exec/s: 0 rss: 72Mb L: 9/42 MS: 1 ChangeBinInt- 00:07:46.484 [2024-11-28 16:31:43.952656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.484 [2024-11-28 16:31:43.952685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.484 #22 NEW cov: 12423 ft: 14233 corp: 6/79b lim: 45 exec/s: 0 rss: 72Mb L: 9/42 MS: 1 CopyPart- 00:07:46.484 [2024-11-28 16:31:44.022888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff08 cdw11:3bff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.484 [2024-11-28 16:31:44.022916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.484 #28 NEW cov: 12423 ft: 14439 corp: 7/88b lim: 45 exec/s: 0 rss: 72Mb L: 9/42 MS: 1 ChangeByte- 00:07:46.484 [2024-11-28 16:31:44.093965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff08 cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.484 [2024-11-28 16:31:44.093993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.484 [2024-11-28 16:31:44.094110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.484 [2024-11-28 16:31:44.094129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.484 [2024-11-28 16:31:44.094253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.484 [2024-11-28 16:31:44.094272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.484 [2024-11-28 16:31:44.094406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.484 [2024-11-28 16:31:44.094422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.744 #32 NEW cov: 12423 ft: 14555 corp: 8/130b lim: 45 exec/s: 0 rss: 72Mb L: 42/42 MS: 4 EraseBytes-ChangeBit-CrossOver-InsertRepeatedBytes- 00:07:46.744 [2024-11-28 16:31:44.164110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff08 cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.164137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.744 [2024-11-28 16:31:44.164263] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.164280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.744 [2024-11-28 16:31:44.164400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.164416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.744 [2024-11-28 16:31:44.164542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1a1aff1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.164560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.744 #33 NEW cov: 12423 ft: 14608 corp: 9/172b lim: 45 exec/s: 0 rss: 72Mb L: 42/42 MS: 1 CrossOver- 00:07:46.744 [2024-11-28 16:31:44.234311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:46464646 cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.234339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.744 [2024-11-28 16:31:44.234465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:46464646 cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.234482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.744 [2024-11-28 16:31:44.234609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:46464646 cdw11:46280002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.234624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.744 [2024-11-28 16:31:44.234752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:46464646 cdw11:46460002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.234770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.744 #34 NEW cov: 12423 ft: 14634 corp: 10/215b lim: 45 exec/s: 0 rss: 72Mb L: 43/43 MS: 1 InsertByte- 00:07:46.744 [2024-11-28 16:31:44.303658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff2b cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.303685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.744 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:46.744 #35 NEW cov: 12446 ft: 14700 corp: 11/224b lim: 45 exec/s: 0 rss: 73Mb L: 9/43 MS: 1 ChangeByte- 00:07:46.744 [2024-11-28 16:31:44.354836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff08 cdw11:1a1a0000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.354863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.744 [2024-11-28 16:31:44.354990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:101a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.355006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.744 [2024-11-28 16:31:44.355134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.355152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.744 [2024-11-28 16:31:44.355279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.744 [2024-11-28 16:31:44.355297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.744 #36 NEW cov: 12446 ft: 14736 corp: 12/266b lim: 45 exec/s: 0 rss: 73Mb L: 42/43 MS: 1 ChangeBinInt- 00:07:47.004 [2024-11-28 16:31:44.403966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffbfff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.403994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.004 #37 NEW cov: 12446 ft: 14786 corp: 13/275b lim: 45 exec/s: 0 rss: 73Mb L: 9/43 MS: 1 ChangeBit- 00:07:47.004 [2024-11-28 16:31:44.455090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff08 cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.455121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.004 [2024-11-28 16:31:44.455241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.455259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.004 [2024-11-28 16:31:44.455384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.455403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.004 [2024-11-28 16:31:44.455526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.455543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.004 #38 NEW cov: 12446 ft: 14847 corp: 14/317b lim: 45 exec/s: 38 rss: 73Mb L: 42/43 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:47.004 [2024-11-28 
16:31:44.524431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0abf cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.524460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.004 #39 NEW cov: 12446 ft: 14908 corp: 15/327b lim: 45 exec/s: 39 rss: 73Mb L: 10/43 MS: 1 CrossOver- 00:07:47.004 [2024-11-28 16:31:44.574554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:3fff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.574583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.004 #40 NEW cov: 12446 ft: 14942 corp: 16/337b lim: 45 exec/s: 40 rss: 73Mb L: 10/43 MS: 1 InsertByte- 00:07:47.004 [2024-11-28 16:31:44.625613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.625640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.004 [2024-11-28 16:31:44.625765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.625782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.004 [2024-11-28 16:31:44.625907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.625924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.004 [2024-11-28 16:31:44.626040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.004 [2024-11-28 16:31:44.626058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.004 #43 NEW cov: 12446 ft: 14977 corp: 17/378b lim: 45 exec/s: 43 rss: 73Mb L: 41/43 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:47.264 [2024-11-28 16:31:44.675166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.675194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.264 [2024-11-28 16:31:44.675318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff870007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.675336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.264 #44 NEW cov: 12446 ft: 15220 corp: 18/396b lim: 45 exec/s: 44 rss: 73Mb L: 18/43 MS: 1 CrossOver- 00:07:47.264 [2024-11-28 16:31:44.725008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 
cdw10:ffffbfff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.725036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.264 #45 NEW cov: 12446 ft: 15272 corp: 19/405b lim: 45 exec/s: 45 rss: 73Mb L: 9/43 MS: 1 ChangeBinInt- 00:07:47.264 [2024-11-28 16:31:44.796101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.796130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.264 [2024-11-28 16:31:44.796261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.796279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.264 [2024-11-28 16:31:44.796410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.796428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.264 [2024-11-28 16:31:44.796553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:52525200 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.796572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.264 #46 NEW cov: 12446 ft: 15280 corp: 20/446b lim: 45 exec/s: 46 rss: 73Mb L: 41/43 MS: 1 ChangeByte- 00:07:47.264 [2024-11-28 16:31:44.866195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff08ffff cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.866223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.264 [2024-11-28 16:31:44.866341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.866361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.264 [2024-11-28 16:31:44.866481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.866499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.264 [2024-11-28 16:31:44.866623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:1a1a1a1a cdw11:1a1a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.264 [2024-11-28 16:31:44.866648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.264 #47 NEW cov: 12446 ft: 15295 corp: 21/488b lim: 45 exec/s: 47 rss: 73Mb L: 42/43 MS: 1 ShuffleBytes- 00:07:47.524 [2024-11-28 
16:31:44.916132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff2b cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:44.916160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.524 [2024-11-28 16:31:44.916290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:44.916309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.524 [2024-11-28 16:31:44.916435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:44.916453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.524 #48 NEW cov: 12446 ft: 15530 corp: 22/522b lim: 45 exec/s: 48 rss: 73Mb L: 34/43 MS: 1 InsertRepeatedBytes- 00:07:47.524 [2024-11-28 16:31:44.985755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffbfff cdw11:23ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:44.985786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.524 #49 NEW cov: 12446 ft: 15531 corp: 23/532b lim: 45 exec/s: 49 rss: 73Mb L: 10/43 MS: 1 InsertByte- 00:07:47.524 [2024-11-28 16:31:45.056782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:45.056810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.524 [2024-11-28 16:31:45.056932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:45.056949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.524 [2024-11-28 16:31:45.057067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:525252ff cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:45.057082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.524 [2024-11-28 16:31:45.057203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:45.057220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.524 #50 NEW cov: 12446 ft: 15610 corp: 24/574b lim: 45 exec/s: 50 rss: 73Mb L: 42/43 MS: 1 CrossOver- 00:07:47.524 [2024-11-28 16:31:45.106129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffff08 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:45.106160] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.524 #51 NEW cov: 12446 ft: 15638 corp: 25/583b lim: 45 exec/s: 51 rss: 73Mb L: 9/43 MS: 1 CrossOver- 00:07:47.524 [2024-11-28 16:31:45.157162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:45.157190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.524 [2024-11-28 16:31:45.157308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:45.157326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.524 [2024-11-28 16:31:45.157450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:45.157467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.524 [2024-11-28 16:31:45.157579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.524 [2024-11-28 16:31:45.157596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.783 #54 NEW cov: 12446 ft: 15650 corp: 26/625b lim: 45 exec/s: 54 rss: 73Mb L: 42/43 MS: 3 EraseBytes-EraseBytes-InsertRepeatedBytes- 00:07:47.783 [2024-11-28 16:31:45.207297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.207326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.207455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.207475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.207595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.207618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.207748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:005252ff cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.207766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.783 #55 NEW cov: 12446 ft: 15654 corp: 27/667b lim: 45 exec/s: 55 rss: 73Mb L: 42/43 MS: 1 InsertByte- 00:07:47.783 [2024-11-28 16:31:45.277592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) 
qid:0 cid:4 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.277623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.277751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:52005252 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.277767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.277889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.277906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.278023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.278040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.783 #56 NEW cov: 12446 ft: 15673 corp: 28/708b lim: 45 exec/s: 56 rss: 73Mb L: 41/43 MS: 1 ChangeBinInt- 00:07:47.783 [2024-11-28 16:31:45.327092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.327120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.327238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff870007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.327257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.783 #57 NEW cov: 12446 ft: 15716 corp: 29/727b lim: 45 exec/s: 57 rss: 73Mb L: 19/43 MS: 1 InsertByte- 00:07:47.783 [2024-11-28 16:31:45.397962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.397989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.398120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.398137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.398261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:52524252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.398280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.783 [2024-11-28 16:31:45.398402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
(01) qid:0 cid:7 nsid:0 cdw10:52525252 cdw11:52520002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.783 [2024-11-28 16:31:45.398419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.784 #58 NEW cov: 12446 ft: 15724 corp: 30/768b lim: 45 exec/s: 58 rss: 73Mb L: 41/43 MS: 1 ChangeBit- 00:07:48.042 [2024-11-28 16:31:45.447166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:08ffff30 cdw11:ff3b0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.043 [2024-11-28 16:31:45.447196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.043 #59 NEW cov: 12446 ft: 15752 corp: 31/778b lim: 45 exec/s: 29 rss: 73Mb L: 10/43 MS: 1 InsertByte- 00:07:48.043 #59 DONE cov: 12446 ft: 15752 corp: 31/778b lim: 45 exec/s: 29 rss: 73Mb 00:07:48.043 ###### Recommended dictionary. ###### 00:07:48.043 "\377\377\377\377\377\377\377\377" # Uses: 1 00:07:48.043 ###### End of recommended dictionary. ###### 00:07:48.043 Done 59 runs in 2 second(s) 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:48.043 16:31:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 
traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:07:48.043 [2024-11-28 16:31:45.632434] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:48.043 [2024-11-28 16:31:45.632499] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3757005 ] 00:07:48.302 [2024-11-28 16:31:45.881239] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.302 [2024-11-28 16:31:45.911688] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.561 [2024-11-28 16:31:45.963897] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:48.561 [2024-11-28 16:31:45.980240] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:48.561 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.561 INFO: Seed: 352903594 00:07:48.561 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:48.561 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:48.561 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:48.561 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.561 #2 INITED exec/s: 0 rss: 64Mb 00:07:48.561 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:48.562 This may also happen if the target rejected all inputs we tried so far 00:07:48.562 [2024-11-28 16:31:46.057443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000830a cdw11:00000000 00:07:48.562 [2024-11-28 16:31:46.057478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.821 NEW_FUNC[1/713]: 0x45d2c8 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:48.821 NEW_FUNC[2/713]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:48.821 #10 NEW cov: 12136 ft: 12135 corp: 2/3b lim: 10 exec/s: 0 rss: 71Mb L: 2/2 MS: 3 ShuffleBytes-ShuffleBytes-InsertByte- 00:07:48.821 [2024-11-28 16:31:46.407858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008383 cdw11:00000000 00:07:48.821 [2024-11-28 16:31:46.407907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.821 #12 NEW cov: 12249 ft: 12951 corp: 3/5b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 EraseBytes-CopyPart- 00:07:49.079 [2024-11-28 16:31:46.477892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000830a cdw11:00000000 00:07:49.079 [2024-11-28 16:31:46.477923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.079 #13 NEW cov: 12255 ft: 13154 corp: 4/8b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 CrossOver- 00:07:49.079 [2024-11-28 16:31:46.528024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 
nsid:0 cdw10:0000830a cdw11:00000000 00:07:49.079 [2024-11-28 16:31:46.528055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.079 #19 NEW cov: 12340 ft: 13385 corp: 5/11b lim: 10 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 ChangeByte- 00:07:49.079 [2024-11-28 16:31:46.598519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a283 cdw11:00000000 00:07:49.079 [2024-11-28 16:31:46.598547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.079 [2024-11-28 16:31:46.598666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000ac8 cdw11:00000000 00:07:49.079 [2024-11-28 16:31:46.598684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.079 #20 NEW cov: 12340 ft: 13706 corp: 6/15b lim: 10 exec/s: 0 rss: 72Mb L: 4/4 MS: 1 InsertByte- 00:07:49.079 [2024-11-28 16:31:46.668689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:49.079 [2024-11-28 16:31:46.668717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.080 [2024-11-28 16:31:46.668837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.080 [2024-11-28 16:31:46.668863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.080 #22 NEW cov: 12340 ft: 13852 corp: 7/19b lim: 10 exec/s: 0 rss: 72Mb L: 4/4 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:49.080 [2024-11-28 16:31:46.718863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:49.080 [2024-11-28 16:31:46.718891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.080 [2024-11-28 16:31:46.719012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.080 [2024-11-28 16:31:46.719029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.339 #23 NEW cov: 12340 ft: 13898 corp: 8/24b lim: 10 exec/s: 0 rss: 72Mb L: 5/5 MS: 1 CMP- DE: "\000\000\000T"- 00:07:49.339 [2024-11-28 16:31:46.769226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000830a cdw11:00000000 00:07:49.339 [2024-11-28 16:31:46.769252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.339 [2024-11-28 16:31:46.769355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a83 cdw11:00000000 00:07:49.339 [2024-11-28 16:31:46.769371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.339 [2024-11-28 16:31:46.769485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:49.339 [2024-11-28 16:31:46.769502] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.339 #24 NEW cov: 12340 ft: 14110 corp: 9/30b lim: 10 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 CopyPart- 00:07:49.339 [2024-11-28 16:31:46.818965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000083ed cdw11:00000000 00:07:49.339 [2024-11-28 16:31:46.818992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.339 #25 NEW cov: 12340 ft: 14184 corp: 10/32b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 1 ChangeBinInt- 00:07:49.339 [2024-11-28 16:31:46.869380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a283 cdw11:00000000 00:07:49.339 [2024-11-28 16:31:46.869408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.339 [2024-11-28 16:31:46.869536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000aaa cdw11:00000000 00:07:49.339 [2024-11-28 16:31:46.869552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.339 #26 NEW cov: 12340 ft: 14218 corp: 11/36b lim: 10 exec/s: 0 rss: 72Mb L: 4/6 MS: 1 ChangeByte- 00:07:49.339 [2024-11-28 16:31:46.939329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000af1 cdw11:00000000 00:07:49.339 [2024-11-28 16:31:46.939355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.339 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:49.339 #27 NEW cov: 12363 ft: 14269 corp: 12/38b lim: 10 exec/s: 0 rss: 72Mb L: 2/6 MS: 1 InsertByte- 00:07:49.598 [2024-11-28 16:31:46.989770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a283 cdw11:00000000 00:07:49.598 [2024-11-28 16:31:46.989800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.598 [2024-11-28 16:31:46.989920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a44 cdw11:00000000 00:07:49.598 [2024-11-28 16:31:46.989939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.599 #28 NEW cov: 12363 ft: 14311 corp: 13/42b lim: 10 exec/s: 0 rss: 72Mb L: 4/6 MS: 1 ChangeByte- 00:07:49.599 [2024-11-28 16:31:47.040064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000830a cdw11:00000000 00:07:49.599 [2024-11-28 16:31:47.040091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.599 [2024-11-28 16:31:47.040206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c8a2 cdw11:00000000 00:07:49.599 [2024-11-28 16:31:47.040222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.599 [2024-11-28 16:31:47.040337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ 
(04) qid:0 cid:6 nsid:0 cdw10:0000830a cdw11:00000000 00:07:49.599 [2024-11-28 16:31:47.040352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.599 #29 NEW cov: 12363 ft: 14352 corp: 14/49b lim: 10 exec/s: 29 rss: 72Mb L: 7/7 MS: 1 CrossOver- 00:07:49.599 [2024-11-28 16:31:47.089822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a83 cdw11:00000000 00:07:49.599 [2024-11-28 16:31:47.089849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.599 #30 NEW cov: 12363 ft: 14374 corp: 15/52b lim: 10 exec/s: 30 rss: 72Mb L: 3/7 MS: 1 CrossOver- 00:07:49.599 [2024-11-28 16:31:47.140045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:49.599 [2024-11-28 16:31:47.140072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.599 [2024-11-28 16:31:47.140183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.599 [2024-11-28 16:31:47.140200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.599 #31 NEW cov: 12363 ft: 14494 corp: 16/57b lim: 10 exec/s: 31 rss: 72Mb L: 5/7 MS: 1 CrossOver- 00:07:49.599 [2024-11-28 16:31:47.210315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000087a2 cdw11:00000000 00:07:49.599 [2024-11-28 16:31:47.210343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.599 [2024-11-28 16:31:47.210472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000830a cdw11:00000000 00:07:49.599 [2024-11-28 16:31:47.210489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.858 #32 NEW cov: 12363 ft: 14514 corp: 17/62b lim: 10 exec/s: 32 rss: 72Mb L: 5/7 MS: 1 InsertByte- 00:07:49.858 [2024-11-28 16:31:47.280618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:49.858 [2024-11-28 16:31:47.280646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.858 [2024-11-28 16:31:47.280761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.858 [2024-11-28 16:31:47.280779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.858 #34 NEW cov: 12363 ft: 14529 corp: 18/67b lim: 10 exec/s: 34 rss: 72Mb L: 5/7 MS: 2 ShuffleBytes-PersAutoDict- DE: "\000\000\000T"- 00:07:49.858 [2024-11-28 16:31:47.330460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000083c3 cdw11:00000000 00:07:49.858 [2024-11-28 16:31:47.330488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.858 #35 NEW cov: 12363 ft: 14611 corp: 19/69b lim: 10 exec/s: 35 rss: 73Mb L: 2/7 MS: 
1 ChangeBit- 00:07:49.858 [2024-11-28 16:31:47.400954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:49.858 [2024-11-28 16:31:47.400981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.858 [2024-11-28 16:31:47.401101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:49.858 [2024-11-28 16:31:47.401118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.858 #36 NEW cov: 12363 ft: 14662 corp: 20/74b lim: 10 exec/s: 36 rss: 73Mb L: 5/7 MS: 1 ShuffleBytes- 00:07:49.858 [2024-11-28 16:31:47.470988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000833d cdw11:00000000 00:07:49.858 [2024-11-28 16:31:47.471016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.118 #37 NEW cov: 12363 ft: 14691 corp: 21/76b lim: 10 exec/s: 37 rss: 73Mb L: 2/7 MS: 1 ChangeByte- 00:07:50.118 [2024-11-28 16:31:47.541171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004af1 cdw11:00000000 00:07:50.118 [2024-11-28 16:31:47.541200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.118 #38 NEW cov: 12363 ft: 14785 corp: 22/78b lim: 10 exec/s: 38 rss: 73Mb L: 2/7 MS: 1 ChangeBit- 00:07:50.118 [2024-11-28 16:31:47.611580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a283 cdw11:00000000 00:07:50.118 [2024-11-28 16:31:47.611612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.118 [2024-11-28 16:31:47.611738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:50.118 [2024-11-28 16:31:47.611758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.118 #39 NEW cov: 12363 ft: 14832 corp: 23/83b lim: 10 exec/s: 39 rss: 73Mb L: 5/7 MS: 1 CrossOver- 00:07:50.118 [2024-11-28 16:31:47.661434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f00a cdw11:00000000 00:07:50.118 [2024-11-28 16:31:47.661462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.118 #41 NEW cov: 12363 ft: 14843 corp: 24/85b lim: 10 exec/s: 41 rss: 73Mb L: 2/7 MS: 2 EraseBytes-InsertByte- 00:07:50.118 [2024-11-28 16:31:47.712111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:50.118 [2024-11-28 16:31:47.712140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.118 [2024-11-28 16:31:47.712265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.118 [2024-11-28 16:31:47.712283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:07:50.118 [2024-11-28 16:31:47.712406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005400 cdw11:00000000 00:07:50.119 [2024-11-28 16:31:47.712422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.119 #42 NEW cov: 12363 ft: 14863 corp: 25/92b lim: 10 exec/s: 42 rss: 73Mb L: 7/7 MS: 1 CopyPart- 00:07:50.119 [2024-11-28 16:31:47.761849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000832d cdw11:00000000 00:07:50.119 [2024-11-28 16:31:47.761879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.378 #43 NEW cov: 12363 ft: 14883 corp: 26/94b lim: 10 exec/s: 43 rss: 73Mb L: 2/7 MS: 1 ChangeBit- 00:07:50.378 [2024-11-28 16:31:47.832799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000833d cdw11:00000000 00:07:50.378 [2024-11-28 16:31:47.832829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.378 [2024-11-28 16:31:47.832946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000cbcb cdw11:00000000 00:07:50.378 [2024-11-28 16:31:47.832963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.378 [2024-11-28 16:31:47.833085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000cbcb cdw11:00000000 00:07:50.378 [2024-11-28 16:31:47.833103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.378 [2024-11-28 16:31:47.833217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000cbcb cdw11:00000000 00:07:50.378 [2024-11-28 16:31:47.833234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.378 #44 NEW cov: 12363 ft: 15105 corp: 27/103b lim: 10 exec/s: 44 rss: 73Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:50.378 [2024-11-28 16:31:47.882871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001a00 cdw11:00000000 00:07:50.378 [2024-11-28 16:31:47.882899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.378 [2024-11-28 16:31:47.883022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.378 [2024-11-28 16:31:47.883039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.378 [2024-11-28 16:31:47.883155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00005400 cdw11:00000000 00:07:50.378 [2024-11-28 16:31:47.883172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.378 [2024-11-28 16:31:47.883287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.378 [2024-11-28 
16:31:47.883304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.378 #45 NEW cov: 12363 ft: 15113 corp: 28/111b lim: 10 exec/s: 45 rss: 73Mb L: 8/9 MS: 1 PersAutoDict- DE: "\000\000\000T"- 00:07:50.378 [2024-11-28 16:31:47.952667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:50.378 [2024-11-28 16:31:47.952696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.378 [2024-11-28 16:31:47.952811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.378 [2024-11-28 16:31:47.952828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.378 #46 NEW cov: 12363 ft: 15142 corp: 29/116b lim: 10 exec/s: 46 rss: 73Mb L: 5/9 MS: 1 PersAutoDict- DE: "\000\000\000T"- 00:07:50.378 [2024-11-28 16:31:48.022656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008012 cdw11:00000000 00:07:50.378 [2024-11-28 16:31:48.022685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.678 #47 NEW cov: 12363 ft: 15161 corp: 30/118b lim: 10 exec/s: 23 rss: 73Mb L: 2/9 MS: 1 ChangeBinInt- 00:07:50.678 #47 DONE cov: 12363 ft: 15161 corp: 30/118b lim: 10 exec/s: 23 rss: 73Mb 00:07:50.678 ###### Recommended dictionary. ###### 00:07:50.678 "\000\000\000T" # Uses: 3 00:07:50.678 ###### End of recommended dictionary. ###### 00:07:50.678 Done 47 runs in 2 second(s) 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 
's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:50.678 16:31:48 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:07:50.678 [2024-11-28 16:31:48.207450] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:50.678 [2024-11-28 16:31:48.207530] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3757540 ] 00:07:50.968 [2024-11-28 16:31:48.455874] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.968 [2024-11-28 16:31:48.487256] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.968 [2024-11-28 16:31:48.539699] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.968 [2024-11-28 16:31:48.556033] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:50.968 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.968 INFO: Seed: 2929897265 00:07:50.968 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:50.968 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:50.968 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:50.968 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.968 #2 INITED exec/s: 0 rss: 65Mb 00:07:50.968 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:50.968 This may also happen if the target rejected all inputs we tried so far 00:07:51.256 [2024-11-28 16:31:48.611423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b00 cdw11:00000000 00:07:51.256 [2024-11-28 16:31:48.611451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.516 NEW_FUNC[1/713]: 0x45dcc8 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:51.516 NEW_FUNC[2/713]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:51.516 #4 NEW cov: 12136 ft: 12131 corp: 2/3b lim: 10 exec/s: 0 rss: 72Mb L: 2/2 MS: 2 ChangeBit-InsertByte- 00:07:51.516 [2024-11-28 16:31:48.922184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000720a cdw11:00000000 00:07:51.516 [2024-11-28 16:31:48.922218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.516 #6 NEW cov: 12249 ft: 12804 corp: 3/5b lim: 10 exec/s: 0 rss: 73Mb L: 2/2 MS: 2 ShuffleBytes-InsertByte- 00:07:51.516 [2024-11-28 16:31:48.962497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:51.516 [2024-11-28 16:31:48.962522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.516 [2024-11-28 16:31:48.962576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.516 [2024-11-28 16:31:48.962590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.516 [2024-11-28 16:31:48.962647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:51.516 [2024-11-28 16:31:48.962661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.516 #7 NEW cov: 12255 ft: 13208 corp: 4/11b lim: 10 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:51.516 [2024-11-28 16:31:49.022392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000072f6 cdw11:00000000 00:07:51.516 [2024-11-28 16:31:49.022418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.516 #8 NEW cov: 12340 ft: 13464 corp: 5/13b lim: 10 exec/s: 0 rss: 73Mb L: 2/6 MS: 1 ChangeBinInt- 00:07:51.516 [2024-11-28 16:31:49.062864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:51.516 [2024-11-28 16:31:49.062890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.516 [2024-11-28 16:31:49.062943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009245 cdw11:00000000 00:07:51.516 [2024-11-28 16:31:49.062956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:51.516 [2024-11-28 16:31:49.063009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000314e cdw11:00000000 00:07:51.516 [2024-11-28 16:31:49.063023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.516 [2024-11-28 16:31:49.063076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000cfda cdw11:00000000 00:07:51.516 [2024-11-28 16:31:49.063089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.516 #9 NEW cov: 12340 ft: 13773 corp: 6/22b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CMP- DE: "\377\222E1N\317\332\020"- 00:07:51.516 [2024-11-28 16:31:49.102628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b00 cdw11:00000000 00:07:51.516 [2024-11-28 16:31:49.102654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.516 #10 NEW cov: 12340 ft: 13870 corp: 7/25b lim: 10 exec/s: 0 rss: 73Mb L: 3/9 MS: 1 InsertByte- 00:07:51.516 [2024-11-28 16:31:49.162881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:51.516 [2024-11-28 16:31:49.162907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.776 #11 NEW cov: 12340 ft: 13919 corp: 8/28b lim: 10 exec/s: 0 rss: 73Mb L: 3/9 MS: 1 ChangeBit- 00:07:51.776 [2024-11-28 16:31:49.223374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.223399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.223452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.223466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.223520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.223550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.223606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.223620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.776 #12 NEW cov: 12340 ft: 13961 corp: 9/36b lim: 10 exec/s: 0 rss: 73Mb L: 8/9 MS: 1 CopyPart- 00:07:51.776 [2024-11-28 16:31:49.283517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.283542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.283615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE 
IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.283629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.283686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.283700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.283764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000900 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.283777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.776 #13 NEW cov: 12340 ft: 13979 corp: 10/44b lim: 10 exec/s: 0 rss: 73Mb L: 8/9 MS: 1 CMP- DE: "\011\000"- 00:07:51.776 [2024-11-28 16:31:49.323630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.323655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.323726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009245 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.323740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.323792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000314e cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.323806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.323860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000dfda cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.323873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.776 #14 NEW cov: 12340 ft: 14033 corp: 11/53b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 ChangeByte- 00:07:51.776 [2024-11-28 16:31:49.383786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.383812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.383868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.383881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.383935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.383949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.776 [2024-11-28 16:31:49.384003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE 
IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.776 [2024-11-28 16:31:49.384016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.036 #15 NEW cov: 12340 ft: 14115 corp: 12/61b lim: 10 exec/s: 0 rss: 73Mb L: 8/9 MS: 1 CopyPart- 00:07:52.036 [2024-11-28 16:31:49.443940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.443965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.444038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.444052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.444105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.444122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.444177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.444191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.036 #16 NEW cov: 12340 ft: 14165 corp: 13/70b lim: 10 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CrossOver- 00:07:52.036 [2024-11-28 16:31:49.503733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000320a cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.503758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.036 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:52.036 #17 NEW cov: 12363 ft: 14239 corp: 14/72b lim: 10 exec/s: 0 rss: 74Mb L: 2/9 MS: 1 ChangeBit- 00:07:52.036 [2024-11-28 16:31:49.544240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.544266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.544321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000b9 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.544335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.544390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.544403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.544457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.544470] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.036 #18 NEW cov: 12363 ft: 14268 corp: 15/80b lim: 10 exec/s: 0 rss: 74Mb L: 8/9 MS: 1 ChangeByte- 00:07:52.036 [2024-11-28 16:31:49.584320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007272 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.584346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.584402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.584415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.584469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.584482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.584537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.584550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.036 #19 NEW cov: 12363 ft: 14290 corp: 16/89b lim: 10 exec/s: 19 rss: 74Mb L: 9/9 MS: 1 CrossOver- 00:07:52.036 [2024-11-28 16:31:49.644526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007208 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.644553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.644612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.644626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.644679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.644692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.036 [2024-11-28 16:31:49.644744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.036 [2024-11-28 16:31:49.644757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.296 #20 NEW cov: 12363 ft: 14293 corp: 17/98b lim: 10 exec/s: 20 rss: 74Mb L: 9/9 MS: 1 ChangeBit- 00:07:52.296 [2024-11-28 16:31:49.704409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b00 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.704434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.704489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO 
SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e64a cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.704503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.296 #21 NEW cov: 12363 ft: 14448 corp: 18/102b lim: 10 exec/s: 21 rss: 74Mb L: 4/9 MS: 1 InsertByte- 00:07:52.296 [2024-11-28 16:31:49.744786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.744811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.744881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.744895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.744950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.744963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.745016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.745030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.296 #22 NEW cov: 12363 ft: 14484 corp: 19/111b lim: 10 exec/s: 22 rss: 74Mb L: 9/9 MS: 1 CopyPart- 00:07:52.296 [2024-11-28 16:31:49.784801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b00 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.784826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.784899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e64a cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.784913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.784969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000b00 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.784982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.296 #23 NEW cov: 12363 ft: 14498 corp: 20/118b lim: 10 exec/s: 23 rss: 74Mb L: 7/9 MS: 1 CopyPart- 00:07:52.296 [2024-11-28 16:31:49.845122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.845147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.845219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.845232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:52.296 [2024-11-28 16:31:49.845286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002400 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.845299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.845353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.845366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.296 #24 NEW cov: 12363 ft: 14505 corp: 21/126b lim: 10 exec/s: 24 rss: 74Mb L: 8/9 MS: 1 ChangeByte- 00:07:52.296 [2024-11-28 16:31:49.885204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.885229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.885286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000080a cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.885300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.296 [2024-11-28 16:31:49.885356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.296 [2024-11-28 16:31:49.885370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.297 [2024-11-28 16:31:49.885425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.297 [2024-11-28 16:31:49.885438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.297 #25 NEW cov: 12363 ft: 14523 corp: 22/135b lim: 10 exec/s: 25 rss: 74Mb L: 9/9 MS: 1 ShuffleBytes- 00:07:52.556 [2024-11-28 16:31:49.945404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:07:52.556 [2024-11-28 16:31:49.945430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:49.945485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00009245 cdw11:00000000 00:07:52.556 [2024-11-28 16:31:49.945499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:49.945551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ce4e cdw11:00000000 00:07:52.556 [2024-11-28 16:31:49.945564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:49.945621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000dfda cdw11:00000000 00:07:52.556 [2024-11-28 16:31:49.945635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:07:52.556 #26 NEW cov: 12363 ft: 14539 corp: 23/144b lim: 10 exec/s: 26 rss: 74Mb L: 9/9 MS: 1 ChangeBinInt- 00:07:52.556 [2024-11-28 16:31:50.005511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.005539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:50.005595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.005615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:50.005670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.005683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:50.005737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.005750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.556 #27 NEW cov: 12363 ft: 14548 corp: 24/152b lim: 10 exec/s: 27 rss: 74Mb L: 8/9 MS: 1 CrossOver- 00:07:52.556 [2024-11-28 16:31:50.045529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.045556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:50.045612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000aff cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.045626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:50.045680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000092 cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.045694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.556 #28 NEW cov: 12363 ft: 14592 corp: 25/158b lim: 10 exec/s: 28 rss: 74Mb L: 6/9 MS: 1 CrossOver- 00:07:52.556 [2024-11-28 16:31:50.105625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007474 cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.105657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:50.105712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000742b cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.105726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.556 #30 NEW cov: 12363 ft: 14616 corp: 26/162b lim: 10 exec/s: 30 rss: 74Mb L: 4/9 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:52.556 [2024-11-28 16:31:50.165863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b00 cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.165889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.556 [2024-11-28 16:31:50.165942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e64a cdw11:00000000 00:07:52.556 [2024-11-28 16:31:50.165956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.557 [2024-11-28 16:31:50.166008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000bff cdw11:00000000 00:07:52.557 [2024-11-28 16:31:50.166021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.816 #31 NEW cov: 12363 ft: 14629 corp: 27/169b lim: 10 exec/s: 31 rss: 74Mb L: 7/9 MS: 1 ChangeBinInt- 00:07:52.816 [2024-11-28 16:31:50.225908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007474 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.225937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.225991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007426 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.226004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.816 #32 NEW cov: 12363 ft: 14664 corp: 28/174b lim: 10 exec/s: 32 rss: 75Mb L: 5/9 MS: 1 InsertByte- 00:07:52.816 [2024-11-28 16:31:50.286294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000072 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.286319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.286388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000800 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.286402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.286454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.286468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.286519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.286533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.816 #33 NEW cov: 12363 ft: 14675 corp: 29/183b lim: 10 exec/s: 33 rss: 75Mb L: 9/9 MS: 1 ShuffleBytes- 00:07:52.816 [2024-11-28 16:31:50.326317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b00 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.326343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.326410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e64a cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.326423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.326474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000bff cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.326489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.816 #34 NEW cov: 12363 ft: 14685 corp: 30/190b lim: 10 exec/s: 34 rss: 75Mb L: 7/9 MS: 1 ChangeByte- 00:07:52.816 [2024-11-28 16:31:50.386613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.386638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.386705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000081a cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.386720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.386768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.386782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.386831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.386848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.816 #35 NEW cov: 12363 ft: 14698 corp: 31/199b lim: 10 exec/s: 35 rss: 75Mb L: 9/9 MS: 1 ChangeBit- 00:07:52.816 [2024-11-28 16:31:50.446709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b00 cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.446734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.446789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e67f cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.446802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.446853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.446866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.816 [2024-11-28 16:31:50.446917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00007f7f cdw11:00000000 00:07:52.816 [2024-11-28 16:31:50.446930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:53.076 #36 NEW cov: 12363 ft: 14773 corp: 32/207b lim: 10 exec/s: 36 rss: 75Mb L: 8/9 MS: 1 InsertRepeatedBytes- 00:07:53.076 [2024-11-28 16:31:50.486645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b74 cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.486669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.076 [2024-11-28 16:31:50.486737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007474 cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.486751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.076 [2024-11-28 16:31:50.486804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000262b cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.486817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.076 #37 NEW cov: 12363 ft: 14795 corp: 33/213b lim: 10 exec/s: 37 rss: 75Mb L: 6/9 MS: 1 CrossOver- 00:07:53.076 [2024-11-28 16:31:50.546932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007208 cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.546957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.076 [2024-11-28 16:31:50.547025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001000 cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.547038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.076 [2024-11-28 16:31:50.547092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.547105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.076 [2024-11-28 16:31:50.547157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.547170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.076 #38 NEW cov: 12363 ft: 14801 corp: 34/222b lim: 10 exec/s: 38 rss: 75Mb L: 9/9 MS: 1 ChangeBit- 00:07:53.076 [2024-11-28 16:31:50.587030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007200 cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.587058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.076 [2024-11-28 16:31:50.587111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.587124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.076 [2024-11-28 16:31:50.587176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:53.076 [2024-11-28 
16:31:50.587205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.076 [2024-11-28 16:31:50.587261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:07:53.076 [2024-11-28 16:31:50.587274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.076 #39 NEW cov: 12363 ft: 14805 corp: 35/230b lim: 10 exec/s: 19 rss: 75Mb L: 8/9 MS: 1 CMP- DE: "\000\010"- 00:07:53.076 #39 DONE cov: 12363 ft: 14805 corp: 35/230b lim: 10 exec/s: 19 rss: 75Mb 00:07:53.076 ###### Recommended dictionary. ###### 00:07:53.076 "\377\222E1N\317\332\020" # Uses: 0 00:07:53.076 "\011\000" # Uses: 0 00:07:53.076 "\000\010" # Uses: 0 00:07:53.076 ###### End of recommended dictionary. ###### 00:07:53.076 Done 39 runs in 2 second(s) 00:07:53.076 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:07:53.335 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:53.336 16:31:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:07:53.336 [2024-11-28 16:31:50.770132] Starting SPDK 
v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:53.336 [2024-11-28 16:31:50.770201] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3757874 ] 00:07:53.595 [2024-11-28 16:31:51.020742] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:53.595 [2024-11-28 16:31:51.050337] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:53.595 [2024-11-28 16:31:51.102660] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:53.595 [2024-11-28 16:31:51.119013] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:53.595 INFO: Running with entropic power schedule (0xFF, 100). 00:07:53.595 INFO: Seed: 1196924905 00:07:53.595 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:53.595 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:53.595 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:53.595 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.595 [2024-11-28 16:31:51.196169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.595 [2024-11-28 16:31:51.196205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.595 #2 INITED cov: 12164 ft: 12137 corp: 1/1b exec/s: 0 rss: 70Mb 00:07:53.854 [2024-11-28 16:31:51.246741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.854 [2024-11-28 16:31:51.246770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.854 [2024-11-28 16:31:51.246852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.854 [2024-11-28 16:31:51.246869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.854 #3 NEW cov: 12277 ft: 13435 corp: 2/3b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 InsertByte- 00:07:53.854 [2024-11-28 16:31:51.316685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.855 [2024-11-28 16:31:51.316714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.855 #4 NEW cov: 12283 ft: 13692 corp: 3/4b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 EraseBytes- 00:07:53.855 [2024-11-28 16:31:51.386936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.855 [2024-11-28 16:31:51.386963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.855 #5 NEW cov: 12368 ft: 13916 corp: 4/5b lim: 5 exec/s: 0 rss: 71Mb L: 
1/2 MS: 1 CrossOver- 00:07:53.855 [2024-11-28 16:31:51.457324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.855 [2024-11-28 16:31:51.457351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.855 #6 NEW cov: 12368 ft: 14034 corp: 5/6b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:54.114 [2024-11-28 16:31:51.508073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.508101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.114 [2024-11-28 16:31:51.508176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.508192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.114 #7 NEW cov: 12368 ft: 14089 corp: 6/8b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 ChangeBit- 00:07:54.114 [2024-11-28 16:31:51.557791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.557818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.114 #8 NEW cov: 12368 ft: 14181 corp: 7/9b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeBit- 00:07:54.114 [2024-11-28 16:31:51.608137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.608165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.114 #9 NEW cov: 12368 ft: 14299 corp: 8/10b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeBinInt- 00:07:54.114 [2024-11-28 16:31:51.678503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.678530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.114 #10 NEW cov: 12368 ft: 14370 corp: 9/11b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeBinInt- 00:07:54.114 [2024-11-28 16:31:51.730373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.730399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.114 [2024-11-28 16:31:51.730494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.730509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:54.114 [2024-11-28 16:31:51.730582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.730596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.114 [2024-11-28 16:31:51.730695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.730711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.114 [2024-11-28 16:31:51.730779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.114 [2024-11-28 16:31:51.730796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.114 #11 NEW cov: 12368 ft: 14801 corp: 10/16b lim: 5 exec/s: 0 rss: 71Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:54.373 [2024-11-28 16:31:51.789457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.373 [2024-11-28 16:31:51.789485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.373 [2024-11-28 16:31:51.789567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.373 [2024-11-28 16:31:51.789583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.373 #12 NEW cov: 12368 ft: 14819 corp: 11/18b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:54.373 [2024-11-28 16:31:51.839705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.373 [2024-11-28 16:31:51.839732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.373 [2024-11-28 16:31:51.839817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.373 [2024-11-28 16:31:51.839832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.373 #13 NEW cov: 12368 ft: 14842 corp: 12/20b lim: 5 exec/s: 0 rss: 71Mb L: 2/5 MS: 1 InsertByte- 00:07:54.373 [2024-11-28 16:31:51.910740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.373 [2024-11-28 16:31:51.910772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.373 [2024-11-28 16:31:51.910857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.373 [2024-11-28 16:31:51.910873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.373 [2024-11-28 16:31:51.910953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.373 [2024-11-28 16:31:51.910969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.373 #14 NEW cov: 12368 ft: 15051 corp: 13/23b lim: 5 exec/s: 0 rss: 71Mb L: 3/5 MS: 1 CrossOver- 00:07:54.373 [2024-11-28 16:31:51.980378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.374 [2024-11-28 16:31:51.980405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.678 #15 NEW cov: 12368 ft: 15075 corp: 14/24b lim: 5 exec/s: 0 rss: 72Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:54.678 [2024-11-28 16:31:52.052268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.678 [2024-11-28 16:31:52.052295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.678 [2024-11-28 16:31:52.052373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.678 [2024-11-28 16:31:52.052389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.678 [2024-11-28 16:31:52.052462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.678 [2024-11-28 16:31:52.052477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.678 [2024-11-28 16:31:52.052551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.678 [2024-11-28 16:31:52.052565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.678 [2024-11-28 16:31:52.052646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.678 [2024-11-28 16:31:52.052663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.936 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:54.936 #16 NEW cov: 12391 ft: 15135 corp: 15/29b lim: 5 exec/s: 16 rss: 73Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:54.937 [2024-11-28 16:31:52.381804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 
[2024-11-28 16:31:52.381840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.937 [2024-11-28 16:31:52.381967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.381987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.937 [2024-11-28 16:31:52.382107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.382123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.937 [2024-11-28 16:31:52.382247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.382266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.937 [2024-11-28 16:31:52.382380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.382398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.937 #17 NEW cov: 12391 ft: 15268 corp: 16/34b lim: 5 exec/s: 17 rss: 73Mb L: 5/5 MS: 1 ChangeBit- 00:07:54.937 [2024-11-28 16:31:52.451063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.451094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.937 [2024-11-28 16:31:52.451205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.451222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.937 #18 NEW cov: 12391 ft: 15464 corp: 17/36b lim: 5 exec/s: 18 rss: 73Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:54.937 [2024-11-28 16:31:52.501232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.501261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.937 [2024-11-28 16:31:52.501374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.501392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.937 #19 NEW cov: 12391 ft: 15485 corp: 18/38b lim: 5 exec/s: 19 rss: 73Mb L: 2/5 MS: 1 InsertByte- 00:07:54.937 [2024-11-28 16:31:52.551370] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.551399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.937 [2024-11-28 16:31:52.551521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.937 [2024-11-28 16:31:52.551538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.937 #20 NEW cov: 12391 ft: 15495 corp: 19/40b lim: 5 exec/s: 20 rss: 73Mb L: 2/5 MS: 1 CrossOver- 00:07:55.196 [2024-11-28 16:31:52.601311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.196 [2024-11-28 16:31:52.601340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.196 #21 NEW cov: 12391 ft: 15566 corp: 20/41b lim: 5 exec/s: 21 rss: 73Mb L: 1/5 MS: 1 CrossOver- 00:07:55.196 [2024-11-28 16:31:52.671801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.196 [2024-11-28 16:31:52.671830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.196 [2024-11-28 16:31:52.671966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.196 [2024-11-28 16:31:52.671985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.196 #22 NEW cov: 12391 ft: 15593 corp: 21/43b lim: 5 exec/s: 22 rss: 73Mb L: 2/5 MS: 1 InsertByte- 00:07:55.196 [2024-11-28 16:31:52.721635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.196 [2024-11-28 16:31:52.721666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.196 #23 NEW cov: 12391 ft: 15683 corp: 22/44b lim: 5 exec/s: 23 rss: 73Mb L: 1/5 MS: 1 CopyPart- 00:07:55.196 [2024-11-28 16:31:52.772057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.196 [2024-11-28 16:31:52.772084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.196 [2024-11-28 16:31:52.772207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.196 [2024-11-28 16:31:52.772224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.196 #24 NEW cov: 12391 ft: 15690 corp: 23/46b lim: 5 exec/s: 24 rss: 73Mb L: 2/5 MS: 1 ChangeBit- 00:07:55.196 [2024-11-28 
16:31:52.842265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.196 [2024-11-28 16:31:52.842292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.196 [2024-11-28 16:31:52.842413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.196 [2024-11-28 16:31:52.842429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.455 #25 NEW cov: 12391 ft: 15699 corp: 24/48b lim: 5 exec/s: 25 rss: 73Mb L: 2/5 MS: 1 CrossOver- 00:07:55.455 [2024-11-28 16:31:52.913331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:52.913357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:52.913499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:52.913518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:52.913641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:52.913660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:52.913784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:52.913800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:52.913926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:52.913944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.455 #26 NEW cov: 12391 ft: 15720 corp: 25/53b lim: 5 exec/s: 26 rss: 73Mb L: 5/5 MS: 1 ChangeBinInt- 00:07:55.455 [2024-11-28 16:31:52.983268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:52.983296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:52.983411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:52.983428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:52.983558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:52.983575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:52.983699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:52.983718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.455 #27 NEW cov: 12391 ft: 15767 corp: 26/57b lim: 5 exec/s: 27 rss: 74Mb L: 4/5 MS: 1 CrossOver- 00:07:55.455 [2024-11-28 16:31:53.053874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:53.053903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:53.054032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:53.054050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:53.054176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:53.054195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:53.054318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:53.054337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.455 [2024-11-28 16:31:53.054460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.455 [2024-11-28 16:31:53.054478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:55.455 #28 NEW cov: 12391 ft: 15771 corp: 27/62b lim: 5 exec/s: 28 rss: 74Mb L: 5/5 MS: 1 ChangeByte- 00:07:55.714 [2024-11-28 16:31:53.122875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.714 [2024-11-28 16:31:53.122906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.714 #29 NEW cov: 12391 ft: 15789 corp: 28/63b lim: 5 exec/s: 29 rss: 74Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:55.714 [2024-11-28 16:31:53.173010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.714 [2024-11-28 16:31:53.173040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.714 #30 NEW cov: 12391 ft: 15801 corp: 29/64b lim: 5 exec/s: 15 rss: 74Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:55.714 #30 DONE cov: 12391 ft: 15801 corp: 29/64b lim: 5 exec/s: 15 rss: 74Mb 00:07:55.714 Done 30 runs in 2 second(s) 00:07:55.714 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:07:55.714 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:55.715 16:31:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:07:55.973 [2024-11-28 16:31:53.377850] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:07:55.973 [2024-11-28 16:31:53.377916] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3758369 ] 00:07:56.233 [2024-11-28 16:31:53.627953] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.233 [2024-11-28 16:31:53.658654] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.233 [2024-11-28 16:31:53.710720] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:56.233 [2024-11-28 16:31:53.727061] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:56.233 INFO: Running with entropic power schedule (0xFF, 100). 00:07:56.233 INFO: Seed: 3803946592 00:07:56.233 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:56.233 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:56.233 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:56.233 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.233 [2024-11-28 16:31:53.775663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.233 [2024-11-28 16:31:53.775698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.233 #2 INITED cov: 12161 ft: 12161 corp: 1/1b exec/s: 0 rss: 70Mb 00:07:56.233 [2024-11-28 16:31:53.825616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.233 [2024-11-28 16:31:53.825664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.492 #3 NEW cov: 12277 ft: 12931 corp: 2/2b lim: 5 exec/s: 0 rss: 71Mb L: 1/1 MS: 1 CrossOver- 00:07:56.492 [2024-11-28 16:31:53.915922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.492 [2024-11-28 16:31:53.915952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.492 [2024-11-28 16:31:53.916000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.492 [2024-11-28 16:31:53.916016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.492 #4 NEW cov: 12283 ft: 13807 corp: 3/4b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 CrossOver- 00:07:56.492 [2024-11-28 16:31:53.976016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.492 [2024-11-28 16:31:53.976048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.492 #5 NEW cov: 12368 ft: 14104 corp: 4/5b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 EraseBytes- 00:07:56.492 [2024-11-28 16:31:54.066314] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.492 [2024-11-28 16:31:54.066345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.492 [2024-11-28 16:31:54.066393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.492 [2024-11-28 16:31:54.066408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.492 #6 NEW cov: 12368 ft: 14163 corp: 5/7b lim: 5 exec/s: 0 rss: 71Mb L: 2/2 MS: 1 InsertByte- 00:07:56.492 [2024-11-28 16:31:54.126415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.492 [2024-11-28 16:31:54.126445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.751 #7 NEW cov: 12368 ft: 14282 corp: 6/8b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 CrossOver- 00:07:56.751 [2024-11-28 16:31:54.176561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.751 [2024-11-28 16:31:54.176593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.751 #8 NEW cov: 12368 ft: 14498 corp: 7/9b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:56.751 [2024-11-28 16:31:54.266795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.751 [2024-11-28 16:31:54.266827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.751 #9 NEW cov: 12368 ft: 14546 corp: 8/10b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeBit- 00:07:56.751 [2024-11-28 16:31:54.357044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.751 [2024-11-28 16:31:54.357074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.011 #10 NEW cov: 12368 ft: 14616 corp: 9/11b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ChangeByte- 00:07:57.011 [2024-11-28 16:31:54.447239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.011 [2024-11-28 16:31:54.447268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.011 #11 NEW cov: 12368 ft: 14678 corp: 10/12b lim: 5 exec/s: 0 rss: 71Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:57.011 [2024-11-28 16:31:54.537688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.011 [2024-11-28 16:31:54.537719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.011 [2024-11-28 16:31:54.537753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.011 [2024-11-28 16:31:54.537769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.011 [2024-11-28 16:31:54.537799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.011 [2024-11-28 16:31:54.537815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.011 #12 NEW cov: 12368 ft: 14873 corp: 11/15b lim: 5 exec/s: 0 rss: 72Mb L: 3/3 MS: 1 InsertByte- 00:07:57.011 [2024-11-28 16:31:54.627726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.011 [2024-11-28 16:31:54.627757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.530 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:57.530 #13 NEW cov: 12391 ft: 14890 corp: 12/16b lim: 5 exec/s: 13 rss: 73Mb L: 1/3 MS: 1 ChangeByte- 00:07:57.530 [2024-11-28 16:31:54.938664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.530 [2024-11-28 16:31:54.938706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.530 #14 NEW cov: 12391 ft: 14909 corp: 13/17b lim: 5 exec/s: 14 rss: 73Mb L: 1/3 MS: 1 ChangeBinInt- 00:07:57.530 [2024-11-28 16:31:54.988624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.530 [2024-11-28 16:31:54.988655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.530 #15 NEW cov: 12391 ft: 14938 corp: 14/18b lim: 5 exec/s: 15 rss: 73Mb L: 1/3 MS: 1 ShuffleBytes- 00:07:57.530 [2024-11-28 16:31:55.038754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.530 [2024-11-28 16:31:55.038785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.530 #16 NEW cov: 12391 ft: 14986 corp: 15/19b lim: 5 exec/s: 16 rss: 73Mb L: 1/3 MS: 1 ShuffleBytes- 00:07:57.530 [2024-11-28 16:31:55.129025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.530 [2024-11-28 16:31:55.129055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.531 #17 NEW cov: 12391 ft: 14999 corp: 16/20b lim: 5 exec/s: 17 rss: 73Mb L: 1/3 MS: 1 CopyPart- 00:07:57.789 [2024-11-28 16:31:55.179142] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.789 [2024-11-28 16:31:55.179172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.789 #18 NEW cov: 12391 ft: 15018 corp: 17/21b lim: 5 exec/s: 18 rss: 73Mb L: 1/3 MS: 1 ShuffleBytes- 00:07:57.789 [2024-11-28 16:31:55.269403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.789 [2024-11-28 16:31:55.269435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.789 #19 NEW cov: 12391 ft: 15064 corp: 18/22b lim: 5 exec/s: 19 rss: 73Mb L: 1/3 MS: 1 CrossOver- 00:07:57.789 [2024-11-28 16:31:55.359580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.789 [2024-11-28 16:31:55.359620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.789 #20 NEW cov: 12391 ft: 15073 corp: 19/23b lim: 5 exec/s: 20 rss: 73Mb L: 1/3 MS: 1 CrossOver- 00:07:57.789 [2024-11-28 16:31:55.409737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.789 [2024-11-28 16:31:55.409768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.048 #21 NEW cov: 12391 ft: 15097 corp: 20/24b lim: 5 exec/s: 21 rss: 73Mb L: 1/3 MS: 1 ChangeByte- 00:07:58.048 [2024-11-28 16:31:55.469903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.048 [2024-11-28 16:31:55.469933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.048 #22 NEW cov: 12391 ft: 15108 corp: 21/25b lim: 5 exec/s: 22 rss: 73Mb L: 1/3 MS: 1 ChangeBinInt- 00:07:58.048 [2024-11-28 16:31:55.560120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.048 [2024-11-28 16:31:55.560157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.048 #23 NEW cov: 12391 ft: 15205 corp: 22/26b lim: 5 exec/s: 23 rss: 73Mb L: 1/3 MS: 1 ShuffleBytes- 00:07:58.048 [2024-11-28 16:31:55.610220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.048 [2024-11-28 16:31:55.610251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.048 #24 NEW cov: 12391 ft: 15232 corp: 23/27b lim: 5 exec/s: 24 rss: 73Mb L: 1/3 MS: 1 ChangeByte- 00:07:58.048 [2024-11-28 16:31:55.660390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:58.048 [2024-11-28 16:31:55.660420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.308 #25 NEW cov: 12392 ft: 15288 corp: 24/28b lim: 5 exec/s: 25 rss: 73Mb L: 1/3 MS: 1 CopyPart- 00:07:58.308 [2024-11-28 16:31:55.750865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.308 [2024-11-28 16:31:55.750895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.308 [2024-11-28 16:31:55.750943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.308 [2024-11-28 16:31:55.750959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.308 [2024-11-28 16:31:55.750988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.308 [2024-11-28 16:31:55.751004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.308 [2024-11-28 16:31:55.751033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.308 [2024-11-28 16:31:55.751048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.308 #26 NEW cov: 12392 ft: 15571 corp: 25/32b lim: 5 exec/s: 13 rss: 73Mb L: 4/4 MS: 1 CopyPart- 00:07:58.308 #26 DONE cov: 12392 ft: 15571 corp: 25/32b lim: 5 exec/s: 13 rss: 73Mb 00:07:58.308 Done 26 runs in 2 second(s) 00:07:58.308 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:07:58.308 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:58.308 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:58.308 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:58.308 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:58.308 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:58.308 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 
00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:58.309 16:31:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:07:58.309 [2024-11-28 16:31:55.954717] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:07:58.309 [2024-11-28 16:31:55.954790] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3758899 ] 00:07:58.568 [2024-11-28 16:31:56.197728] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:58.827 [2024-11-28 16:31:56.228746] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:07:58.827 [2024-11-28 16:31:56.281119] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:58.827 [2024-11-28 16:31:56.297469] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:58.827 INFO: Running with entropic power schedule (0xFF, 100). 00:07:58.827 INFO: Seed: 2080981019 00:07:58.827 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:07:58.827 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:07:58.827 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:58.827 INFO: A corpus is not provided, starting from an empty corpus 00:07:58.827 #2 INITED exec/s: 0 rss: 64Mb 00:07:58.827 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:58.827 This may also happen if the target rejected all inputs we tried so far 00:07:58.827 [2024-11-28 16:31:56.343114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.827 [2024-11-28 16:31:56.343142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.827 [2024-11-28 16:31:56.343202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:1e1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.828 [2024-11-28 16:31:56.343217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.828 [2024-11-28 16:31:56.343275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:1e1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.828 [2024-11-28 16:31:56.343289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.087 NEW_FUNC[1/714]: 0x45f648 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:59.087 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:59.087 #23 NEW cov: 12187 ft: 12185 corp: 2/28b lim: 40 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:07:59.087 [2024-11-28 16:31:56.663958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.087 [2024-11-28 16:31:56.663993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.087 [2024-11-28 16:31:56.664071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:1e1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.087 [2024-11-28 16:31:56.664086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.087 [2024-11-28 16:31:56.664148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:1e1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.087 [2024-11-28 16:31:56.664162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.087 #24 NEW cov: 12300 ft: 12692 corp: 3/54b lim: 40 exec/s: 0 rss: 72Mb L: 26/27 MS: 1 EraseBytes- 00:07:59.087 [2024-11-28 16:31:56.724162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.087 [2024-11-28 16:31:56.724188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.087 [2024-11-28 16:31:56.724264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.087 [2024-11-28 16:31:56.724278] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.087 [2024-11-28 16:31:56.724337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.087 [2024-11-28 16:31:56.724351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.087 [2024-11-28 16:31:56.724410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.087 [2024-11-28 16:31:56.724424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.348 #31 NEW cov: 12306 ft: 13425 corp: 4/90b lim: 40 exec/s: 0 rss: 72Mb L: 36/36 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:59.348 [2024-11-28 16:31:56.764132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a161e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.764158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.764234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:1e1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.764248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.764308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:1e1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.764321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.348 #32 NEW cov: 12391 ft: 13713 corp: 5/116b lim: 40 exec/s: 0 rss: 72Mb L: 26/36 MS: 1 ChangeBit- 00:07:59.348 [2024-11-28 16:31:56.824414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.824440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.824504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.824521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.824581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.824595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.824660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.824674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.348 #33 NEW cov: 12391 ft: 13827 corp: 6/152b lim: 40 exec/s: 0 rss: 72Mb L: 36/36 MS: 1 CopyPart- 00:07:59.348 [2024-11-28 16:31:56.884637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac80ac8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.884665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.884726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.884740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.884799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.884812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.884875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.884889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.348 #39 NEW cov: 12391 ft: 13925 corp: 7/189b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 CopyPart- 00:07:59.348 [2024-11-28 16:31:56.944817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.944843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.944908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.944921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.944984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.945014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.945075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.945088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.348 #44 NEW cov: 12391 ft: 14032 corp: 8/226b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 5 CopyPart-ChangeBit-ChangeByte-ShuffleBytes-CrossOver- 
00:07:59.348 [2024-11-28 16:31:56.984904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac80ac8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.984931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.984996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.985010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.985073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.985087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.348 [2024-11-28 16:31:56.985150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.348 [2024-11-28 16:31:56.985163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.608 #45 NEW cov: 12391 ft: 14064 corp: 9/263b lim: 40 exec/s: 0 rss: 72Mb L: 37/37 MS: 1 CopyPart- 00:07:59.608 [2024-11-28 16:31:57.044789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4ac8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.608 [2024-11-28 16:31:57.044815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.608 [2024-11-28 16:31:57.044893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.608 [2024-11-28 16:31:57.044908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.608 #46 NEW cov: 12391 ft: 14405 corp: 10/285b lim: 40 exec/s: 0 rss: 72Mb L: 22/37 MS: 1 EraseBytes- 00:07:59.608 [2024-11-28 16:31:57.104809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.608 [2024-11-28 16:31:57.104835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.608 #47 NEW cov: 12391 ft: 14774 corp: 11/297b lim: 40 exec/s: 0 rss: 72Mb L: 12/37 MS: 1 CrossOver- 00:07:59.608 [2024-11-28 16:31:57.145352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0ac8c8 cdw11:c8c8c800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.608 [2024-11-28 16:31:57.145379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.608 [2024-11-28 16:31:57.145439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000024c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:59.608 [2024-11-28 16:31:57.145454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.608 [2024-11-28 16:31:57.145518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.608 [2024-11-28 16:31:57.145533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.608 [2024-11-28 16:31:57.145590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.608 [2024-11-28 16:31:57.145609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.609 #48 NEW cov: 12391 ft: 14796 corp: 12/333b lim: 40 exec/s: 0 rss: 72Mb L: 36/37 MS: 1 ChangeBinInt- 00:07:59.609 [2024-11-28 16:31:57.185444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.609 [2024-11-28 16:31:57.185470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.609 [2024-11-28 16:31:57.185533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.609 [2024-11-28 16:31:57.185547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.609 [2024-11-28 16:31:57.185613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c9c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.609 [2024-11-28 16:31:57.185627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.609 [2024-11-28 16:31:57.185689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.609 [2024-11-28 16:31:57.185702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.609 #49 NEW cov: 12391 ft: 14846 corp: 13/369b lim: 40 exec/s: 0 rss: 72Mb L: 36/37 MS: 1 ChangeBit- 00:07:59.609 [2024-11-28 16:31:57.225604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0ac80ac8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.609 [2024-11-28 16:31:57.225630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.609 [2024-11-28 16:31:57.225691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.609 [2024-11-28 16:31:57.225705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.609 [2024-11-28 16:31:57.225779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 
cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.609 [2024-11-28 16:31:57.225793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.609 [2024-11-28 16:31:57.225855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.609 [2024-11-28 16:31:57.225869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.609 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:07:59.609 #55 NEW cov: 12414 ft: 14905 corp: 14/406b lim: 40 exec/s: 0 rss: 73Mb L: 37/37 MS: 1 ChangeBinInt- 00:07:59.869 [2024-11-28 16:31:57.265876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0ac8c8 cdw11:c8c8c800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.869 [2024-11-28 16:31:57.265902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.869 [2024-11-28 16:31:57.265979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.869 [2024-11-28 16:31:57.265993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.869 [2024-11-28 16:31:57.266055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:000024c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.869 [2024-11-28 16:31:57.266072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.869 [2024-11-28 16:31:57.266133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.869 [2024-11-28 16:31:57.266146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.869 [2024-11-28 16:31:57.266209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.869 [2024-11-28 16:31:57.266223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.869 #56 NEW cov: 12414 ft: 14957 corp: 15/446b lim: 40 exec/s: 0 rss: 73Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:59.869 [2024-11-28 16:31:57.325903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.325929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.325991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.326004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.326065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.326079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.326142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.326155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.870 #57 NEW cov: 12414 ft: 14968 corp: 16/484b lim: 40 exec/s: 57 rss: 73Mb L: 38/40 MS: 1 CrossOver- 00:07:59.870 [2024-11-28 16:31:57.365906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:42aaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.365932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.366006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.366021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.366083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.366096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.870 #61 NEW cov: 12414 ft: 14997 corp: 17/510b lim: 40 exec/s: 61 rss: 73Mb L: 26/40 MS: 4 ChangeByte-ChangeByte-InsertByte-InsertRepeatedBytes- 00:07:59.870 [2024-11-28 16:31:57.405980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:42aaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.406006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.406086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.406100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.406162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff00aa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.406176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.870 #62 NEW cov: 12414 ft: 15032 corp: 18/540b lim: 40 exec/s: 62 rss: 73Mb L: 30/40 MS: 1 CMP- DE: "\377\377\377\000"- 00:07:59.870 [2024-11-28 16:31:57.466285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.466311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.466391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.466405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.466470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.466484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.466544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.466558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.870 #63 NEW cov: 12414 ft: 15042 corp: 19/577b lim: 40 exec/s: 63 rss: 73Mb L: 37/40 MS: 1 CopyPart- 00:07:59.870 [2024-11-28 16:31:57.506380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.506407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.506471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.506486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.506548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.506562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.870 [2024-11-28 16:31:57.506629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c5c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.870 [2024-11-28 16:31:57.506643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.131 #64 NEW cov: 12414 ft: 15051 corp: 20/614b lim: 40 exec/s: 64 rss: 73Mb L: 37/40 MS: 1 ChangeByte- 00:08:00.131 [2024-11-28 16:31:57.546392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.131 [2024-11-28 16:31:57.546418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.131 [2024-11-28 16:31:57.546486] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffff00 cdw11:aaaa4a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.131 [2024-11-28 16:31:57.546500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.131 [2024-11-28 16:31:57.546563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.131 [2024-11-28 16:31:57.546576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.131 #65 NEW cov: 12414 ft: 15081 corp: 21/640b lim: 40 exec/s: 65 rss: 73Mb L: 26/40 MS: 1 CrossOver- 00:08:00.131 [2024-11-28 16:31:57.606539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:42aaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.131 [2024-11-28 16:31:57.606564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.131 [2024-11-28 16:31:57.606632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaa0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.131 [2024-11-28 16:31:57.606647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.131 [2024-11-28 16:31:57.606709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:001eaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.131 [2024-11-28 16:31:57.606723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.131 #66 NEW cov: 12414 ft: 15123 corp: 22/670b lim: 40 exec/s: 66 rss: 73Mb L: 30/40 MS: 1 ChangeBinInt- 00:08:00.131 [2024-11-28 16:31:57.666857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.131 [2024-11-28 16:31:57.666883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.131 [2024-11-28 16:31:57.666950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.131 [2024-11-28 16:31:57.666964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.132 [2024-11-28 16:31:57.667031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffff00 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.132 [2024-11-28 16:31:57.667045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.132 [2024-11-28 16:31:57.667107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.132 [2024-11-28 16:31:57.667120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
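The DE: strings attached to the CMP- and PersAutoDict- mutations in this run (e.g. "\377\377\377\000") are libFuzzer dictionary entries printed as C-style octal escapes. As a quick sketch in the same shell the harness uses, the raw bytes behind one entry can be inspected with printf/od:

    # Decode a libFuzzer dictionary entry from its octal-escaped form.
    printf '\377\377\377\000' | od -An -tx1
    # prints: ff ff ff 00   (the 4-byte pattern the mutator splices in)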
00:08:00.132 #67 NEW cov: 12414 ft: 15134 corp: 23/708b lim: 40 exec/s: 67 rss: 73Mb L: 38/40 MS: 1 PersAutoDict- DE: "\377\377\377\000"- 00:08:00.132 [2024-11-28 16:31:57.726727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.132 [2024-11-28 16:31:57.726752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.132 [2024-11-28 16:31:57.726816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:1e1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.132 [2024-11-28 16:31:57.726833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.132 #68 NEW cov: 12414 ft: 15139 corp: 24/729b lim: 40 exec/s: 68 rss: 73Mb L: 21/40 MS: 1 CrossOver- 00:08:00.132 [2024-11-28 16:31:57.767161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.132 [2024-11-28 16:31:57.767186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.132 [2024-11-28 16:31:57.767266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.132 [2024-11-28 16:31:57.767281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.132 [2024-11-28 16:31:57.767345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffff00 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.132 [2024-11-28 16:31:57.767358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.132 [2024-11-28 16:31:57.767422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.132 [2024-11-28 16:31:57.767435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.391 #69 NEW cov: 12414 ft: 15158 corp: 25/767b lim: 40 exec/s: 69 rss: 73Mb L: 38/40 MS: 1 ShuffleBytes- 00:08:00.391 [2024-11-28 16:31:57.827324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:42aaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.827350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.827431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.827446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.827510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.827522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.827588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0000001e cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.827606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.391 #70 NEW cov: 12414 ft: 15172 corp: 26/803b lim: 40 exec/s: 70 rss: 73Mb L: 36/40 MS: 1 CopyPart- 00:08:00.391 [2024-11-28 16:31:57.887383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:42aaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.887408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.887490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaa0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.887504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.887569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:001eaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.887585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.391 #71 NEW cov: 12414 ft: 15185 corp: 27/833b lim: 40 exec/s: 71 rss: 73Mb L: 30/40 MS: 1 ChangeBinInt- 00:08:00.391 [2024-11-28 16:31:57.927723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.927748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.927831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c80000 cdw11:0000c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.927845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.927910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.927924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.927988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c9c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.928001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.928064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) 
qid:0 cid:8 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.928078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:00.391 #72 NEW cov: 12414 ft: 15202 corp: 28/873b lim: 40 exec/s: 72 rss: 73Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:00.391 [2024-11-28 16:31:57.987782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:42aaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.987807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.987884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.987899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.987963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:aaaaaaaa cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.987977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.391 [2024-11-28 16:31:57.988039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:3500001e cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.391 [2024-11-28 16:31:57.988053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.391 #73 NEW cov: 12414 ft: 15226 corp: 29/909b lim: 40 exec/s: 73 rss: 74Mb L: 36/40 MS: 1 CMP- DE: "\0015"- 00:08:00.651 [2024-11-28 16:31:58.047671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.047698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.047770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:1e1e1e1e cdw11:1e1e1e1e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.047785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.651 #74 NEW cov: 12414 ft: 15238 corp: 30/932b lim: 40 exec/s: 74 rss: 74Mb L: 23/40 MS: 1 CMP- DE: "\377\002"- 00:08:00.651 [2024-11-28 16:31:58.108197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0ac8c8 cdw11:c8c8c800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.108222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.108305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.108319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.108385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:000024c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.108399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.108462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.108476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.651 #75 NEW cov: 12414 ft: 15258 corp: 31/971b lim: 40 exec/s: 75 rss: 74Mb L: 39/40 MS: 1 EraseBytes- 00:08:00.651 [2024-11-28 16:31:58.168159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:42aaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.168185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.168266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaaaafb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.168280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.168345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff00aa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.168358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.651 #76 NEW cov: 12414 ft: 15292 corp: 32/1001b lim: 40 exec/s: 76 rss: 74Mb L: 30/40 MS: 1 ChangeBit- 00:08:00.651 [2024-11-28 16:31:58.208420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0ac8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.208445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.208524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.208538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.208606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8373b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.208619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.208685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.208699] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.651 #77 NEW cov: 12414 ft: 15302 corp: 33/1037b lim: 40 exec/s: 77 rss: 74Mb L: 36/40 MS: 1 ChangeBinInt- 00:08:00.651 [2024-11-28 16:31:58.248353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:42aaaaaa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.248378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.248443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:aaaaaaaa cdw11:aaaa0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.248457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.651 [2024-11-28 16:31:58.248522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:001eaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.651 [2024-11-28 16:31:58.248535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.651 #78 NEW cov: 12414 ft: 15319 corp: 34/1067b lim: 40 exec/s: 78 rss: 74Mb L: 30/40 MS: 1 ChangeByte- 00:08:00.911 [2024-11-28 16:31:58.308690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:aaaaaaff cdw11:924536ac SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.911 [2024-11-28 16:31:58.308716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.911 [2024-11-28 16:31:58.308795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e96594aa cdw11:aaaaaaaa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.911 [2024-11-28 16:31:58.308810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.911 [2024-11-28 16:31:58.308872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffff00 cdw11:aaaa4a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.911 [2024-11-28 16:31:58.308885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.911 [2024-11-28 16:31:58.308950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:c8c8c8c8 cdw11:c8c8c8c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:00.911 [2024-11-28 16:31:58.308963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.911 #79 NEW cov: 12414 ft: 15334 corp: 35/1101b lim: 40 exec/s: 39 rss: 74Mb L: 34/40 MS: 1 CMP- DE: "\377\222E6\254\351e\224"- 00:08:00.911 #79 DONE cov: 12414 ft: 15334 corp: 35/1101b lim: 40 exec/s: 39 rss: 74Mb 00:08:00.911 ###### Recommended dictionary. ###### 00:08:00.911 "\377\377\377\000" # Uses: 1 00:08:00.911 "\0015" # Uses: 0 00:08:00.911 "\377\002" # Uses: 0 00:08:00.911 "\377\222E6\254\351e\224" # Uses: 0 00:08:00.911 ###### End of recommended dictionary. 
###### 00:08:00.911 Done 79 runs in 2 second(s) 00:08:00.911 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:00.911 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:00.911 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.911 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:00.911 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:00.911 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:00.911 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.911 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:00.911 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:00.912 16:31:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:00.912 [2024-11-28 16:31:58.511973] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
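The nvmf/run.sh xtrace lines just above show the per-run setup for fuzzer 11: a unique TCP port is derived from the run index (44 plus the zero-padded index, matching the 4411/4412 ports in this log), the corpus directory is created, the listener port in the shared JSON config is rewritten with sed, and LSAN leak suppressions are registered before llvm_nvme_fuzz is launched. A minimal bash sketch of those steps follows; the sed and echo redirect targets are assumptions inferred from the -c flag and $suppress_file (xtrace does not print redirections), and $rootdir stands in for the absolute workspace path:

    i=11
    port="44$(printf %02d "$i")"        # 4411 for run 11, 4412 for run 12
    corpus_dir="$rootdir/../corpus/llvm_nvmf_$i"
    nvmf_cfg="/tmp/fuzz_json_$i.conf"
    suppress_file="/var/tmp/suppress_nvmf_fuzz"

    mkdir -p "$corpus_dir"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    # Rebase the listener port in the shared JSON config (output file assumed
    # from the -c argument passed to the fuzzer below).
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # Suppress two known shutdown-path leaks (destination file assumed).
    echo "leak:spdk_nvmf_qpair_disconnect" > "$suppress_file"
    echo "leak:nvmf_ctrlr_create" >> "$suppress_file"

    LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0" \
      "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m 0x1 -s 512 -P "$rootdir/../output/llvm/" -F "$trid" \
      -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$i"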
00:08:00.912 [2024-11-28 16:31:58.512044] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3759224 ] 00:08:01.171 [2024-11-28 16:31:58.766490] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:01.171 [2024-11-28 16:31:58.794160] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:01.430 [2024-11-28 16:31:58.846628] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:01.430 [2024-11-28 16:31:58.862963] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:01.430 INFO: Running with entropic power schedule (0xFF, 100). 00:08:01.430 INFO: Seed: 351992244 00:08:01.430 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:01.430 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:01.430 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:01.430 INFO: A corpus is not provided, starting from an empty corpus 00:08:01.430 #2 INITED exec/s: 0 rss: 64Mb 00:08:01.430 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:01.430 This may also happen if the target rejected all inputs we tried so far 00:08:01.430 [2024-11-28 16:31:58.918391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:b9b9b90a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.430 [2024-11-28 16:31:58.918420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.690 NEW_FUNC[1/715]: 0x4613b8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:01.690 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.690 #15 NEW cov: 12199 ft: 12195 corp: 2/9b lim: 40 exec/s: 0 rss: 72Mb L: 8/8 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:01.690 [2024-11-28 16:31:59.229096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9a9b9b9 cdw11:b9b9b90a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.690 [2024-11-28 16:31:59.229130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.690 #16 NEW cov: 12312 ft: 12585 corp: 3/17b lim: 40 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 ChangeBit- 00:08:01.690 [2024-11-28 16:31:59.289196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9a9b9b9 cdw11:b9b9b90a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.690 [2024-11-28 16:31:59.289223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.690 #22 NEW cov: 12318 ft: 12873 corp: 4/25b lim: 40 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:01.950 [2024-11-28 16:31:59.349323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9a9b6b9 cdw11:b9b9b90a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.950 [2024-11-28 16:31:59.349349] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.950 #23 NEW cov: 12403 ft: 13256 corp: 5/33b lim: 40 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 ChangeBinInt- 00:08:01.950 [2024-11-28 16:31:59.389436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:a9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.950 [2024-11-28 16:31:59.389462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.950 #29 NEW cov: 12403 ft: 13382 corp: 6/41b lim: 40 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 CrossOver- 00:08:01.950 [2024-11-28 16:31:59.429718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.950 [2024-11-28 16:31:59.429743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.950 [2024-11-28 16:31:59.429816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.950 [2024-11-28 16:31:59.429830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.950 #34 NEW cov: 12403 ft: 14107 corp: 7/62b lim: 40 exec/s: 0 rss: 72Mb L: 21/21 MS: 5 ChangeBit-ChangeByte-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:08:01.950 [2024-11-28 16:31:59.469680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:b9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.950 [2024-11-28 16:31:59.469705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.950 #40 NEW cov: 12403 ft: 14193 corp: 8/70b lim: 40 exec/s: 0 rss: 72Mb L: 8/21 MS: 1 CopyPart- 00:08:01.950 [2024-11-28 16:31:59.509847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:a9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.950 [2024-11-28 16:31:59.509873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.950 #41 NEW cov: 12403 ft: 14216 corp: 9/78b lim: 40 exec/s: 0 rss: 72Mb L: 8/21 MS: 1 ShuffleBytes- 00:08:01.950 [2024-11-28 16:31:59.569972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9a9b90a cdw11:b9b9b90a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.950 [2024-11-28 16:31:59.569997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.209 #42 NEW cov: 12403 ft: 14269 corp: 10/86b lim: 40 exec/s: 0 rss: 72Mb L: 8/21 MS: 1 CrossOver- 00:08:02.209 [2024-11-28 16:31:59.630149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9a9b90a cdw11:b9b9bc0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.209 [2024-11-28 16:31:59.630174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.209 #48 NEW cov: 12403 ft: 14315 corp: 11/94b lim: 40 exec/s: 0 rss: 72Mb L: 8/21 MS: 1 ChangeBinInt- 00:08:02.209 [2024-11-28 
16:31:59.690286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:b9b9b90a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.209 [2024-11-28 16:31:59.690311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.209 #49 NEW cov: 12403 ft: 14344 corp: 12/102b lim: 40 exec/s: 0 rss: 73Mb L: 8/21 MS: 1 ShuffleBytes- 00:08:02.209 [2024-11-28 16:31:59.730381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:b9b9f90a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.209 [2024-11-28 16:31:59.730407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.209 #55 NEW cov: 12403 ft: 14402 corp: 13/110b lim: 40 exec/s: 0 rss: 73Mb L: 8/21 MS: 1 ChangeBit- 00:08:02.209 [2024-11-28 16:31:59.790556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:a9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.209 [2024-11-28 16:31:59.790581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.209 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:02.209 #56 NEW cov: 12426 ft: 14445 corp: 14/118b lim: 40 exec/s: 0 rss: 73Mb L: 8/21 MS: 1 CopyPart- 00:08:02.210 [2024-11-28 16:31:59.850766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:b9b92d0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.210 [2024-11-28 16:31:59.850792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.469 #57 NEW cov: 12426 ft: 14544 corp: 15/126b lim: 40 exec/s: 0 rss: 73Mb L: 8/21 MS: 1 ChangeByte- 00:08:02.469 [2024-11-28 16:31:59.910915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:a9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.469 [2024-11-28 16:31:59.910943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.469 #63 NEW cov: 12426 ft: 14565 corp: 16/136b lim: 40 exec/s: 63 rss: 73Mb L: 10/21 MS: 1 CrossOver- 00:08:02.469 [2024-11-28 16:31:59.950987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9a9 cdw11:b9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.469 [2024-11-28 16:31:59.951014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.469 #64 NEW cov: 12426 ft: 14633 corp: 17/150b lim: 40 exec/s: 64 rss: 73Mb L: 14/21 MS: 1 CopyPart- 00:08:02.469 [2024-11-28 16:32:00.011175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9f90ab9 cdw11:b9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.469 [2024-11-28 16:32:00.011201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.469 #65 NEW cov: 12426 ft: 14650 corp: 18/161b lim: 40 exec/s: 65 rss: 73Mb L: 11/21 MS: 1 CopyPart- 00:08:02.469 [2024-11-28 16:32:00.051445] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.469 [2024-11-28 16:32:00.051473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.469 [2024-11-28 16:32:00.051532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.469 [2024-11-28 16:32:00.051545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.469 #71 NEW cov: 12426 ft: 14734 corp: 19/183b lim: 40 exec/s: 71 rss: 73Mb L: 22/22 MS: 1 InsertByte- 00:08:02.469 [2024-11-28 16:32:00.111496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:b9b9b90a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.469 [2024-11-28 16:32:00.111528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.729 #72 NEW cov: 12426 ft: 14738 corp: 20/191b lim: 40 exec/s: 72 rss: 73Mb L: 8/22 MS: 1 ShuffleBytes- 00:08:02.729 [2024-11-28 16:32:00.151538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:acb9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.729 [2024-11-28 16:32:00.151564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.729 #73 NEW cov: 12426 ft: 14763 corp: 21/201b lim: 40 exec/s: 73 rss: 73Mb L: 10/22 MS: 1 ChangeBinInt- 00:08:02.729 [2024-11-28 16:32:00.211714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b926b9 cdw11:b9b9b92d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.729 [2024-11-28 16:32:00.211740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.729 #74 NEW cov: 12426 ft: 14811 corp: 22/210b lim: 40 exec/s: 74 rss: 73Mb L: 9/22 MS: 1 InsertByte- 00:08:02.729 [2024-11-28 16:32:00.272027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9a9 cdw11:b9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.729 [2024-11-28 16:32:00.272052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.729 [2024-11-28 16:32:00.272108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:b9b9b9a9 cdw11:b9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.729 [2024-11-28 16:32:00.272121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.729 #75 NEW cov: 12426 ft: 14823 corp: 23/231b lim: 40 exec/s: 75 rss: 73Mb L: 21/22 MS: 1 CrossOver- 00:08:02.729 [2024-11-28 16:32:00.332059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:b9b9b9f9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.729 [2024-11-28 16:32:00.332083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.729 #81 NEW cov: 12426 ft: 14831 corp: 24/239b lim: 40 exec/s: 81 rss: 
74Mb L: 8/22 MS: 1 ChangeBit- 00:08:02.989 [2024-11-28 16:32:00.392684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4a454545 cdw11:45454545 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.392710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.989 [2024-11-28 16:32:00.392769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:45454545 cdw11:45454545 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.392783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.989 [2024-11-28 16:32:00.392839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:45454545 cdw11:45454545 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.392853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.989 [2024-11-28 16:32:00.392911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:45454545 cdw11:45454545 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.392925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.989 #84 NEW cov: 12426 ft: 15213 corp: 25/276b lim: 40 exec/s: 84 rss: 74Mb L: 37/37 MS: 3 ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:08:02.989 [2024-11-28 16:32:00.432289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b0b9b9 cdw11:b9b9b90a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.432316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.989 #90 NEW cov: 12426 ft: 15227 corp: 26/284b lim: 40 exec/s: 90 rss: 74Mb L: 8/37 MS: 1 ChangeBinInt- 00:08:02.989 [2024-11-28 16:32:00.472554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.472578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.989 [2024-11-28 16:32:00.472654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.472668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.989 #91 NEW cov: 12426 ft: 15234 corp: 27/304b lim: 40 exec/s: 91 rss: 74Mb L: 20/37 MS: 1 EraseBytes- 00:08:02.989 [2024-11-28 16:32:00.533076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:4a454545 cdw11:45454545 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.533101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.989 [2024-11-28 16:32:00.533174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:45454545 cdw11:45454545 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.533189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.989 [2024-11-28 16:32:00.533243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:45454545 cdw11:45454545 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.533257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.989 [2024-11-28 16:32:00.533312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:45454545 cdw11:45454545 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.533325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.989 #92 NEW cov: 12426 ft: 15257 corp: 28/341b lim: 40 exec/s: 92 rss: 74Mb L: 37/37 MS: 1 ShuffleBytes- 00:08:02.989 [2024-11-28 16:32:00.592798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9b9 cdw11:c3b92d0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.592824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.989 #93 NEW cov: 12426 ft: 15351 corp: 29/349b lim: 40 exec/s: 93 rss: 74Mb L: 8/37 MS: 1 ChangeBinInt- 00:08:02.989 [2024-11-28 16:32:00.633076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.633102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.989 [2024-11-28 16:32:00.633159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffff7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.989 [2024-11-28 16:32:00.633173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.248 #94 NEW cov: 12426 ft: 15361 corp: 30/370b lim: 40 exec/s: 94 rss: 74Mb L: 21/37 MS: 1 ChangeByte- 00:08:03.248 [2024-11-28 16:32:00.672991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9f90ab9 cdw11:b9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.248 [2024-11-28 16:32:00.673019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.248 #95 NEW cov: 12426 ft: 15372 corp: 31/381b lim: 40 exec/s: 95 rss: 74Mb L: 11/37 MS: 1 ChangeBinInt- 00:08:03.248 [2024-11-28 16:32:00.733186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b90db9 cdw11:b9b9b9b9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.248 [2024-11-28 16:32:00.733211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.248 [2024-11-28 16:32:00.793331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.248 [2024-11-28 16:32:00.793356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.248 #97 NEW cov: 12426 ft: 15415 corp: 32/396b lim: 40 exec/s: 97 rss: 74Mb L: 15/37 MS: 2 InsertByte-CrossOver- 00:08:03.248 [2024-11-28 16:32:00.833419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:b9b9b9c3 cdw11:b92d0a0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.248 [2024-11-28 16:32:00.833444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.248 #98 NEW cov: 12426 ft: 15429 corp: 33/404b lim: 40 exec/s: 98 rss: 74Mb L: 8/37 MS: 1 CopyPart- 00:08:03.248 [2024-11-28 16:32:00.893794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.248 [2024-11-28 16:32:00.893820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.248 [2024-11-28 16:32:00.893877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.248 [2024-11-28 16:32:00.893891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.508 #99 NEW cov: 12426 ft: 15441 corp: 34/425b lim: 40 exec/s: 49 rss: 74Mb L: 21/37 MS: 1 ChangeBit- 00:08:03.508 #99 DONE cov: 12426 ft: 15441 corp: 34/425b lim: 40 exec/s: 49 rss: 74Mb 00:08:03.508 Done 99 runs in 2 second(s) 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:03.508 
16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:03.508 16:32:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:03.508 [2024-11-28 16:32:01.078576] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:03.508 [2024-11-28 16:32:01.078655] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3759723 ] 00:08:03.767 [2024-11-28 16:32:01.321835] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.767 [2024-11-28 16:32:01.352594] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.767 [2024-11-28 16:32:01.404687] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:04.025 [2024-11-28 16:32:01.421064] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:04.025 INFO: Running with entropic power schedule (0xFF, 100). 00:08:04.025 INFO: Seed: 2908020632 00:08:04.025 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:04.025 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:04.025 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:04.025 INFO: A corpus is not provided, starting from an empty corpus 00:08:04.025 #2 INITED exec/s: 0 rss: 64Mb 00:08:04.025 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
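The INFO: Loaded lines above advertise 384223 inline 8-bit counters over [0x2a3744c, 0x2a9512b) and 384223 PC-table entries over [0x2a95130, 0x3071f20). The region sizes check out against those counts: one byte per counter, and 16 bytes per PC-table entry (a PC plus a flags word on 64-bit). A quick shell check:

    printf '%d\n' $((0x2a9512b - 0x2a3744c))   # -> 384223 (1 byte per counter)
    printf '%d\n' $((0x3071f20 - 0x2a95130))   # -> 6147568 = 384223 * 16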
00:08:04.025 This may also happen if the target rejected all inputs we tried so far 00:08:04.025 [2024-11-28 16:32:01.492931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.025 [2024-11-28 16:32:01.492969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.025 [2024-11-28 16:32:01.493047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.025 [2024-11-28 16:32:01.493064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.025 [2024-11-28 16:32:01.493139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.026 [2024-11-28 16:32:01.493154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.026 [2024-11-28 16:32:01.493229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000a14 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.026 [2024-11-28 16:32:01.493244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.285 NEW_FUNC[1/714]: 0x463128 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:04.285 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:04.285 #6 NEW cov: 12179 ft: 12196 corp: 2/33b lim: 40 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 CMP-CrossOver-ShuffleBytes-InsertRepeatedBytes- DE: "\002\024"- 00:08:04.285 [2024-11-28 16:32:01.842875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.285 [2024-11-28 16:32:01.842929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.285 [2024-11-28 16:32:01.843076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.285 [2024-11-28 16:32:01.843101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.285 [2024-11-28 16:32:01.843241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.285 [2024-11-28 16:32:01.843263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.285 NEW_FUNC[1/1]: 0xfa7a18 in spdk_ring_dequeue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:445 00:08:04.285 #7 NEW cov: 12310 ft: 13202 corp: 3/62b lim: 40 exec/s: 0 rss: 72Mb L: 29/32 MS: 1 EraseBytes- 00:08:04.285 [2024-11-28 16:32:01.923102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 
cid:4 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.285 [2024-11-28 16:32:01.923136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.285 [2024-11-28 16:32:01.923263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.285 [2024-11-28 16:32:01.923280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.285 [2024-11-28 16:32:01.923399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.285 [2024-11-28 16:32:01.923415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.285 [2024-11-28 16:32:01.923546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.285 [2024-11-28 16:32:01.923564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.544 #8 NEW cov: 12316 ft: 13466 corp: 4/100b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:04.544 [2024-11-28 16:32:01.973291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:01.973327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:01.973460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:01.973480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:01.973604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:01.973623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:01.973752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:01.973770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.544 #14 NEW cov: 12401 ft: 13714 corp: 5/139b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 1 CopyPart- 00:08:04.544 [2024-11-28 16:32:02.043440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:25000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.043471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:02.043603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.043621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:02.043739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.043758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:02.043880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000a14 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.043896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.544 #15 NEW cov: 12401 ft: 13865 corp: 6/171b lim: 40 exec/s: 0 rss: 72Mb L: 32/39 MS: 1 ChangeByte- 00:08:04.544 [2024-11-28 16:32:02.093633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.093661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:02.093781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.093799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:02.093924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.093942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:02.094066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:09090909 cdw11:09092700 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.094082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.544 #16 NEW cov: 12401 ft: 13950 corp: 7/210b lim: 40 exec/s: 0 rss: 72Mb L: 39/39 MS: 1 ChangeBinInt- 00:08:04.544 [2024-11-28 16:32:02.163879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.163909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:02.164038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:14000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.164057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:02.164195] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.164215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.544 [2024-11-28 16:32:02.164343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.544 [2024-11-28 16:32:02.164366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.544 #17 NEW cov: 12401 ft: 14052 corp: 8/244b lim: 40 exec/s: 0 rss: 72Mb L: 34/39 MS: 1 PersAutoDict- DE: "\002\024"- 00:08:04.803 [2024-11-28 16:32:02.213711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.803 [2024-11-28 16:32:02.213740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.803 [2024-11-28 16:32:02.213868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.803 [2024-11-28 16:32:02.213886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.214004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.214021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.804 #18 NEW cov: 12401 ft: 14100 corp: 9/269b lim: 40 exec/s: 0 rss: 72Mb L: 25/39 MS: 1 EraseBytes- 00:08:04.804 [2024-11-28 16:32:02.264382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.264412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.264539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.264558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.264689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:02140909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.264706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.264837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.264855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 
16:32:02.264980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:09090909 cdw11:0909090a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.264995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.804 #24 NEW cov: 12401 ft: 14194 corp: 10/309b lim: 40 exec/s: 0 rss: 72Mb L: 40/40 MS: 1 PersAutoDict- DE: "\002\024"- 00:08:04.804 [2024-11-28 16:32:02.313959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.313988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.314135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.314156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.314299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.314319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.804 #26 NEW cov: 12401 ft: 14220 corp: 11/336b lim: 40 exec/s: 0 rss: 72Mb L: 27/40 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:04.804 [2024-11-28 16:32:02.364106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.364134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.364273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090809 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.364290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.364419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.364436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.804 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:04.804 #27 NEW cov: 12424 ft: 14269 corp: 12/361b lim: 40 exec/s: 0 rss: 72Mb L: 25/40 MS: 1 ChangeBit- 00:08:04.804 [2024-11-28 16:32:02.434642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08020000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.434670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.434791] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:14000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.434810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.434942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.434958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.804 [2024-11-28 16:32:02.435081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.804 [2024-11-28 16:32:02.435098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.074 #33 NEW cov: 12424 ft: 14319 corp: 13/395b lim: 40 exec/s: 33 rss: 72Mb L: 34/40 MS: 1 ChangeBit- 00:08:05.074 [2024-11-28 16:32:02.504875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.504907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.505054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:14000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.505072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.505214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00610000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.505233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.505369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.505389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.074 #34 NEW cov: 12424 ft: 14355 corp: 14/429b lim: 40 exec/s: 34 rss: 73Mb L: 34/40 MS: 1 ChangeByte- 00:08:05.074 [2024-11-28 16:32:02.554933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.554962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.555099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:14000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.555117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.555251] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00610000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.555269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.555411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000022 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.555427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.074 #35 NEW cov: 12424 ft: 14416 corp: 15/463b lim: 40 exec/s: 35 rss: 73Mb L: 34/40 MS: 1 ChangeBinInt- 00:08:05.074 [2024-11-28 16:32:02.625196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.625225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.625375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffff0214 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.625391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.625526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.625542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.625670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.625689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.074 #36 NEW cov: 12424 ft: 14501 corp: 16/500b lim: 40 exec/s: 36 rss: 73Mb L: 37/40 MS: 1 InsertRepeatedBytes- 00:08:05.074 [2024-11-28 16:32:02.675324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.675353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.675472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:14000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.675492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 16:32:02.675620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00610000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.675646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.074 [2024-11-28 
16:32:02.675760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:14000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.074 [2024-11-28 16:32:02.675777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.074 #37 NEW cov: 12424 ft: 14536 corp: 17/536b lim: 40 exec/s: 37 rss: 73Mb L: 36/40 MS: 1 PersAutoDict- DE: "\002\024"- 00:08:05.335 [2024-11-28 16:32:02.725551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:25000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.725580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.335 [2024-11-28 16:32:02.725703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.725721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.335 [2024-11-28 16:32:02.725852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.725869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.335 [2024-11-28 16:32:02.726005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:02140000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.726022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.335 #43 NEW cov: 12424 ft: 14556 corp: 18/570b lim: 40 exec/s: 43 rss: 73Mb L: 34/40 MS: 1 PersAutoDict- DE: "\002\024"- 00:08:05.335 [2024-11-28 16:32:02.795760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.795788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.335 [2024-11-28 16:32:02.795918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:14000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.795934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.335 [2024-11-28 16:32:02.796065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00610000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.796083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.335 [2024-11-28 16:32:02.796203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.796220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.335 #44 NEW cov: 12424 ft: 14573 corp: 19/609b lim: 40 exec/s: 44 rss: 73Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:08:05.335 [2024-11-28 16:32:02.865381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.865413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.335 [2024-11-28 16:32:02.865533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000a14 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.865550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.335 #45 NEW cov: 12424 ft: 14790 corp: 20/625b lim: 40 exec/s: 45 rss: 73Mb L: 16/40 MS: 1 EraseBytes- 00:08:05.335 [2024-11-28 16:32:02.935630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00021400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.935659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.335 [2024-11-28 16:32:02.935799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.335 [2024-11-28 16:32:02.935816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.335 #46 NEW cov: 12424 ft: 14896 corp: 21/643b lim: 40 exec/s: 46 rss: 73Mb L: 18/40 MS: 1 PersAutoDict- DE: "\002\024"- 00:08:05.594 [2024-11-28 16:32:03.006437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.594 [2024-11-28 16:32:03.006466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.594 [2024-11-28 16:32:03.006603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.594 [2024-11-28 16:32:03.006620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.594 [2024-11-28 16:32:03.006734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.594 [2024-11-28 16:32:03.006752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.594 [2024-11-28 16:32:03.006886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:02140909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.594 [2024-11-28 16:32:03.006906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.594 #47 NEW cov: 12424 ft: 14911 corp: 22/682b lim: 40 exec/s: 47 rss: 73Mb L: 39/40 MS: 1 PersAutoDict- DE: "\002\024"- 00:08:05.594 [2024-11-28 16:32:03.056630] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.594 [2024-11-28 16:32:03.056660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.594 [2024-11-28 16:32:03.056787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.594 [2024-11-28 16:32:03.056805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.594 [2024-11-28 16:32:03.056929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.594 [2024-11-28 16:32:03.056948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.595 [2024-11-28 16:32:03.057078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.595 [2024-11-28 16:32:03.057099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.595 #48 NEW cov: 12424 ft: 14917 corp: 23/720b lim: 40 exec/s: 48 rss: 73Mb L: 38/40 MS: 1 ShuffleBytes- 00:08:05.595 [2024-11-28 16:32:03.106101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00230000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.595 [2024-11-28 16:32:03.106130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.595 [2024-11-28 16:32:03.106262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000a14 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.595 [2024-11-28 16:32:03.106280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.595 #49 NEW cov: 12424 ft: 14918 corp: 24/736b lim: 40 exec/s: 49 rss: 73Mb L: 16/40 MS: 1 ChangeByte- 00:08:05.595 [2024-11-28 16:32:03.156333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00021400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.595 [2024-11-28 16:32:03.156363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.595 [2024-11-28 16:32:03.156492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000a14 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.595 [2024-11-28 16:32:03.156510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.595 #50 NEW cov: 12424 ft: 14938 corp: 25/752b lim: 40 exec/s: 50 rss: 73Mb L: 16/40 MS: 1 EraseBytes- 00:08:05.595 [2024-11-28 16:32:03.227070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:08020000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.595 [2024-11-28 16:32:03.227099] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.595 [2024-11-28 16:32:03.227226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:14000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.595 [2024-11-28 16:32:03.227246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.595 [2024-11-28 16:32:03.227385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000214 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.595 [2024-11-28 16:32:03.227404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.595 [2024-11-28 16:32:03.227536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.595 [2024-11-28 16:32:03.227555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.854 #51 NEW cov: 12424 ft: 14952 corp: 26/786b lim: 40 exec/s: 51 rss: 73Mb L: 34/40 MS: 1 PersAutoDict- DE: "\002\024"- 00:08:05.854 [2024-11-28 16:32:03.297274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.297305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.297439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.297459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.297601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.297620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.297751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.297769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.854 #52 NEW cov: 12424 ft: 14979 corp: 27/821b lim: 40 exec/s: 52 rss: 73Mb L: 35/40 MS: 1 InsertRepeatedBytes- 00:08:05.854 [2024-11-28 16:32:03.347758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.347790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.347917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:05.854 [2024-11-28 16:32:03.347935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.348073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.348093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.348217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:09090909 cdw11:09090927 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.348235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.348361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:0909090a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.348379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.854 #53 NEW cov: 12424 ft: 14993 corp: 28/861b lim: 40 exec/s: 53 rss: 73Mb L: 40/40 MS: 1 CopyPart- 00:08:05.854 [2024-11-28 16:32:03.417741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.417772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.417905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:14000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.417922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.418054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:02140061 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.418074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.418203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.418222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.854 #54 NEW cov: 12424 ft: 14999 corp: 29/897b lim: 40 exec/s: 54 rss: 73Mb L: 36/40 MS: 1 PersAutoDict- DE: "\002\024"- 00:08:05.854 [2024-11-28 16:32:03.467943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a020000 cdw11:25000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.467973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.468102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.468121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.468250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.468266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.854 [2024-11-28 16:32:03.468397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:02140000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.854 [2024-11-28 16:32:03.468415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.113 #55 NEW cov: 12424 ft: 15016 corp: 30/931b lim: 40 exec/s: 27 rss: 73Mb L: 34/40 MS: 1 ShuffleBytes- 00:08:06.113 #55 DONE cov: 12424 ft: 15016 corp: 30/931b lim: 40 exec/s: 27 rss: 73Mb 00:08:06.113 ###### Recommended dictionary. ###### 00:08:06.113 "\002\024" # Uses: 12 00:08:06.113 ###### End of recommended dictionary. ###### 00:08:06.113 Done 55 runs in 2 second(s) 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:06.113 16:32:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:06.113 [2024-11-28 16:32:03.671750] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:06.113 [2024-11-28 16:32:03.671826] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3760261 ] 00:08:06.372 [2024-11-28 16:32:03.920295] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.372 [2024-11-28 16:32:03.950971] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.372 [2024-11-28 16:32:04.003166] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:06.631 [2024-11-28 16:32:04.019532] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:06.631 INFO: Running with entropic power schedule (0xFF, 100). 00:08:06.631 INFO: Seed: 1213049205 00:08:06.631 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:06.631 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:06.631 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:06.631 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.631 #2 INITED exec/s: 0 rss: 64Mb 00:08:06.631 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:06.631 This may also happen if the target rejected all inputs we tried so far 00:08:06.631 [2024-11-28 16:32:04.075067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.631 [2024-11-28 16:32:04.075095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.631 [2024-11-28 16:32:04.075171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.631 [2024-11-28 16:32:04.075186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.890 NEW_FUNC[1/714]: 0x464cf8 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:06.890 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.890 #13 NEW cov: 12181 ft: 12179 corp: 2/22b lim: 40 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:08:06.890 [2024-11-28 16:32:04.406469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.890 [2024-11-28 16:32:04.406520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.890 #14 NEW cov: 12298 ft: 13280 corp: 3/37b lim: 40 exec/s: 0 rss: 72Mb L: 15/21 MS: 1 EraseBytes- 00:08:06.890 [2024-11-28 16:32:04.476607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.890 [2024-11-28 16:32:04.476638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.890 #20 NEW cov: 12304 ft: 13614 corp: 4/52b lim: 40 exec/s: 0 rss: 72Mb L: 15/21 MS: 1 ChangeBit- 00:08:07.149 [2024-11-28 16:32:04.547456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.547487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.149 [2024-11-28 16:32:04.547609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.547629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.149 [2024-11-28 16:32:04.547759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.547780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.149 [2024-11-28 16:32:04.547902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 
cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.547921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.149 #21 NEW cov: 12389 ft: 14343 corp: 5/90b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:07.149 [2024-11-28 16:32:04.607102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.607133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.149 #22 NEW cov: 12389 ft: 14395 corp: 6/105b lim: 40 exec/s: 0 rss: 72Mb L: 15/38 MS: 1 CrossOver- 00:08:07.149 [2024-11-28 16:32:04.657152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:c3ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.657181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.149 #24 NEW cov: 12389 ft: 14524 corp: 7/119b lim: 40 exec/s: 0 rss: 72Mb L: 14/38 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:07.149 [2024-11-28 16:32:04.707480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.707511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.149 [2024-11-28 16:32:04.707640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00040000 cdw11:007f0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.707659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.149 #25 NEW cov: 12389 ft: 14594 corp: 8/135b lim: 40 exec/s: 0 rss: 72Mb L: 16/38 MS: 1 InsertByte- 00:08:07.149 [2024-11-28 16:32:04.777772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.777802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.149 [2024-11-28 16:32:04.777931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.149 [2024-11-28 16:32:04.777948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.408 #26 NEW cov: 12389 ft: 14609 corp: 9/157b lim: 40 exec/s: 0 rss: 72Mb L: 22/38 MS: 1 InsertByte- 00:08:07.408 [2024-11-28 16:32:04.827909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.408 [2024-11-28 16:32:04.827937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.408 [2024-11-28 16:32:04.828048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE 
(1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.408 [2024-11-28 16:32:04.828066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.408 #27 NEW cov: 12389 ft: 14650 corp: 10/179b lim: 40 exec/s: 0 rss: 72Mb L: 22/38 MS: 1 CrossOver- 00:08:07.408 [2024-11-28 16:32:04.898328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.408 [2024-11-28 16:32:04.898356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.408 [2024-11-28 16:32:04.898478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.408 [2024-11-28 16:32:04.898495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.408 [2024-11-28 16:32:04.898608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.409 [2024-11-28 16:32:04.898623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.409 #28 NEW cov: 12389 ft: 14880 corp: 11/204b lim: 40 exec/s: 0 rss: 72Mb L: 25/38 MS: 1 InsertRepeatedBytes- 00:08:07.409 [2024-11-28 16:32:04.948701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000040 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.409 [2024-11-28 16:32:04.948730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.409 [2024-11-28 16:32:04.948855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.409 [2024-11-28 16:32:04.948882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.409 [2024-11-28 16:32:04.949003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.409 [2024-11-28 16:32:04.949020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.409 [2024-11-28 16:32:04.949138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.409 [2024-11-28 16:32:04.949155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.409 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:07.409 #29 NEW cov: 12412 ft: 14939 corp: 12/242b lim: 40 exec/s: 0 rss: 72Mb L: 38/38 MS: 1 ChangeBit- 00:08:07.409 [2024-11-28 16:32:05.018886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a010000 cdw11:00023716 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:07.409 [2024-11-28 16:32:05.018913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.409 [2024-11-28 16:32:05.019043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:2d000000 cdw11:000000b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.409 [2024-11-28 16:32:05.019061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.409 [2024-11-28 16:32:05.019192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.409 [2024-11-28 16:32:05.019209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.409 [2024-11-28 16:32:05.019331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.409 [2024-11-28 16:32:05.019351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.669 #30 NEW cov: 12412 ft: 14950 corp: 13/280b lim: 40 exec/s: 30 rss: 73Mb L: 38/38 MS: 1 CMP- DE: "\001\000\000\000\0027\026-"- 00:08:07.669 [2024-11-28 16:32:05.089111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000040 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.089138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.669 [2024-11-28 16:32:05.089263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.089280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.669 [2024-11-28 16:32:05.089405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b4b4b400 cdw11:000000b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.089422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.669 [2024-11-28 16:32:05.089550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.089569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.669 #31 NEW cov: 12412 ft: 15011 corp: 14/318b lim: 40 exec/s: 31 rss: 73Mb L: 38/38 MS: 1 CrossOver- 00:08:07.669 [2024-11-28 16:32:05.138817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:0237162d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.138844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.669 [2024-11-28 16:32:05.138957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.138975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.669 #32 NEW cov: 12412 ft: 15047 corp: 15/340b lim: 40 exec/s: 32 rss: 73Mb L: 22/38 MS: 1 PersAutoDict- DE: "\001\000\000\000\0027\026-"- 00:08:07.669 [2024-11-28 16:32:05.188688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:c3ffff01 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.188716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.669 #33 NEW cov: 12412 ft: 15110 corp: 16/354b lim: 40 exec/s: 33 rss: 73Mb L: 14/38 MS: 1 PersAutoDict- DE: "\001\000\000\000\0027\026-"- 00:08:07.669 [2024-11-28 16:32:05.259691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a5b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.259718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.669 [2024-11-28 16:32:05.259853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.259869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.669 [2024-11-28 16:32:05.259999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.260015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.669 [2024-11-28 16:32:05.260152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.260169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.669 #34 NEW cov: 12412 ft: 15144 corp: 17/392b lim: 40 exec/s: 34 rss: 73Mb L: 38/38 MS: 1 ChangeByte- 00:08:07.669 [2024-11-28 16:32:05.309329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:2a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.309356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.669 [2024-11-28 16:32:05.309488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00bb0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.669 [2024-11-28 16:32:05.309506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.930 #35 NEW cov: 12412 ft: 15168 corp: 18/415b lim: 40 exec/s: 35 rss: 73Mb L: 23/38 MS: 1 InsertByte- 00:08:07.930 [2024-11-28 16:32:05.379480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-11-28 16:32:05.379508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 [2024-11-28 16:32:05.379643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00040000 cdw11:007f0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-11-28 16:32:05.379662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.930 #36 NEW cov: 12412 ft: 15190 corp: 19/431b lim: 40 exec/s: 36 rss: 73Mb L: 16/38 MS: 1 ChangeBit- 00:08:07.930 [2024-11-28 16:32:05.449946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-11-28 16:32:05.449973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 [2024-11-28 16:32:05.450104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-11-28 16:32:05.450122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.930 [2024-11-28 16:32:05.450249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-11-28 16:32:05.450267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.930 #37 NEW cov: 12412 ft: 15234 corp: 20/456b lim: 40 exec/s: 37 rss: 73Mb L: 25/38 MS: 1 CopyPart- 00:08:07.930 [2024-11-28 16:32:05.520194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:0237162d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-11-28 16:32:05.520221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.930 [2024-11-28 16:32:05.520354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-11-28 16:32:05.520373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.930 [2024-11-28 16:32:05.520513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00feffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.930 [2024-11-28 16:32:05.520533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.930 #38 NEW cov: 12412 ft: 15239 corp: 21/482b lim: 40 exec/s: 38 rss: 73Mb L: 26/38 MS: 1 CMP- DE: "\376\377\377\365"- 00:08:08.190 [2024-11-28 16:32:05.590559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000040 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.190 [2024-11-28 16:32:05.590589] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.191 [2024-11-28 16:32:05.590725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000000b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.590743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.191 [2024-11-28 16:32:05.590870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:b4b4b400 cdw11:000000b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.590889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.191 [2024-11-28 16:32:05.591022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.591042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.191 #39 NEW cov: 12412 ft: 15271 corp: 22/520b lim: 40 exec/s: 39 rss: 73Mb L: 38/38 MS: 1 ChangeBinInt- 00:08:08.191 [2024-11-28 16:32:05.660637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.660665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.191 [2024-11-28 16:32:05.660792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f7000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.660811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.191 [2024-11-28 16:32:05.660940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.660956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.191 #40 NEW cov: 12412 ft: 15290 corp: 23/545b lim: 40 exec/s: 40 rss: 73Mb L: 25/38 MS: 1 ChangeBinInt- 00:08:08.191 [2024-11-28 16:32:05.710293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:c3ffff01 cdw11:0000002f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.710320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.191 #41 NEW cov: 12412 ft: 15317 corp: 24/559b lim: 40 exec/s: 41 rss: 73Mb L: 14/38 MS: 1 ChangeByte- 00:08:08.191 [2024-11-28 16:32:05.781279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a5b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.781307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.191 [2024-11-28 16:32:05.781442] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.781461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.191 [2024-11-28 16:32:05.781601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.781618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.191 [2024-11-28 16:32:05.781746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:b4b4b4b4 cdw11:b4b4b4b4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.191 [2024-11-28 16:32:05.781765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.191 #42 NEW cov: 12412 ft: 15320 corp: 25/597b lim: 40 exec/s: 42 rss: 73Mb L: 38/38 MS: 1 ShuffleBytes- 00:08:08.450 [2024-11-28 16:32:05.851188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.450 [2024-11-28 16:32:05.851215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.450 [2024-11-28 16:32:05.851346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00fdffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.450 [2024-11-28 16:32:05.851363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.450 [2024-11-28 16:32:05.851501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.450 [2024-11-28 16:32:05.851520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.450 #43 NEW cov: 12412 ft: 15347 corp: 26/622b lim: 40 exec/s: 43 rss: 74Mb L: 25/38 MS: 1 ChangeBinInt- 00:08:08.450 [2024-11-28 16:32:05.900803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.450 [2024-11-28 16:32:05.900829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.450 #44 NEW cov: 12412 ft: 15356 corp: 27/630b lim: 40 exec/s: 44 rss: 74Mb L: 8/38 MS: 1 EraseBytes- 00:08:08.450 [2024-11-28 16:32:05.951494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.450 [2024-11-28 16:32:05.951522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.450 [2024-11-28 16:32:05.951658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.451 [2024-11-28 16:32:05.951678] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.451 [2024-11-28 16:32:05.951801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.451 [2024-11-28 16:32:05.951819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.451 #45 NEW cov: 12412 ft: 15371 corp: 28/655b lim: 40 exec/s: 45 rss: 74Mb L: 25/38 MS: 1 ShuffleBytes- 00:08:08.451 [2024-11-28 16:32:06.021230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.451 [2024-11-28 16:32:06.021259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.451 #46 NEW cov: 12412 ft: 15399 corp: 29/670b lim: 40 exec/s: 46 rss: 74Mb L: 15/38 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:08.451 [2024-11-28 16:32:06.071812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.451 [2024-11-28 16:32:06.071842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.451 [2024-11-28 16:32:06.071973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000600 cdw11:00030000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.451 [2024-11-28 16:32:06.071990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.451 [2024-11-28 16:32:06.072122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:08.451 [2024-11-28 16:32:06.072140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.710 #47 NEW cov: 12412 ft: 15408 corp: 30/695b lim: 40 exec/s: 23 rss: 74Mb L: 25/38 MS: 1 ChangeBinInt- 00:08:08.710 #47 DONE cov: 12412 ft: 15408 corp: 30/695b lim: 40 exec/s: 23 rss: 74Mb 00:08:08.710 ###### Recommended dictionary. ###### 00:08:08.710 "\001\000\000\000\0027\026-" # Uses: 2 00:08:08.710 "\376\377\377\365" # Uses: 0 00:08:08.710 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:08.710 ###### End of recommended dictionary. 
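The three dictionary entries above are byte patterns libFuzzer found worth reusing; the "# Uses" counts say how many of this run's mutations re-applied each one (the PersAutoDict steps in the corpus lines). A minimal sketch of carrying them into a future run follows, converting the octal escapes printed above into the two-digit \xNN hex escapes that libFuzzer dictionary files accept; the file name is hypothetical, and whether this harness forwards a -dict= option through to libFuzzer is an assumption to verify against its argument parsing:

  # nvmf_13.dict (hypothetical name) -- dictionary transcribed from the run above
  # "\001\000\000\000\0027\026-"  (# Uses: 2)
  kw1="\x01\x00\x00\x00\x02\x37\x16\x2d"
  # "\376\377\377\365"  (# Uses: 0)
  kw2="\xfe\xff\xff\xf5"
  # "\000\000\000\000\000\000\000\000"  (# Uses: 0)
  kw3="\x00\x00\x00\x00\x00\x00\x00\x00"

If the fuzz binary passes unrecognized flags through to libFuzzer (again an assumption), -dict=nvmf_13.dict would load the file; otherwise the same byte strings can simply be dropped into the corpus directory as raw seed files.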
###### 00:08:08.710 Done 47 runs in 2 second(s) 00:08:08.710 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:08.710 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:08.710 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.710 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:08.710 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:08.710 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:08.710 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.710 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:08.711 16:32:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:08.711 [2024-11-28 16:32:06.246067] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
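The traced commands above show the pattern each nvmf fuzzer instance follows: derive a unique TCP listener port from the fuzzer number, rewrite the JSON config template's default 4420 port, register two known shutdown leaks with LSAN, and launch the harness against a per-fuzzer corpus directory for a fixed time budget. A condensed sketch of one iteration, reusing the variable names visible in the trace; $rootdir stands in for the SPDK checkout, and the redirections into the conf and suppression files are inferred from the file names rather than taken from the script, so treat both as assumptions:

  fuzzer_type=14
  timen=1                                   # per-run time budget in seconds
  core=0x1
  port="44$(printf '%02d' "$fuzzer_type")"  # 14 -> 4414, 15 -> 4415, ...
  nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
  suppress_file=/var/tmp/suppress_nvmf_fuzz
  corpus_dir="$rootdir/../corpus/llvm_nvmf_${fuzzer_type}"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

  mkdir -p "$corpus_dir"
  # point the template's 4420 listener at this instance's port
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  # suppress two allocator leaks that are expected on target shutdown
  echo 'leak:spdk_nvmf_qpair_disconnect' > "$suppress_file"
  echo 'leak:nvmf_ctrlr_create' >> "$suppress_file"

  LSAN_OPTIONS="report_objects=1:suppressions=${suppress_file}:print_suppressions=0" \
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m "$core" -s 512 -P "$rootdir/../output/llvm/" -F "$trid" \
      -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"

The -Z flag selects the fuzzer type (which admin-command handler gets mutated, matching the local fuzzer_type in the trace), and -t carries the one-second budget; the "Done N runs in 2 second(s)" summaries include startup overhead on top of that budget.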
00:08:08.711 [2024-11-28 16:32:06.246120] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3760637 ] 00:08:08.970 [2024-11-28 16:32:06.426809] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.970 [2024-11-28 16:32:06.448573] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.970 [2024-11-28 16:32:06.500932] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.970 [2024-11-28 16:32:06.517269] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:08.970 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.970 INFO: Seed: 3710058287 00:08:08.970 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:08.970 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:08.970 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:08.970 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.970 #2 INITED exec/s: 0 rss: 64Mb 00:08:08.970 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:08.970 This may also happen if the target rejected all inputs we tried so far 00:08:08.970 [2024-11-28 16:32:06.562870] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.970 [2024-11-28 16:32:06.562899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.970 [2024-11-28 16:32:06.562956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.970 [2024-11-28 16:32:06.562969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.229 NEW_FUNC[1/715]: 0x4668c8 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:09.229 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:09.229 #5 NEW cov: 12174 ft: 12171 corp: 2/21b lim: 35 exec/s: 0 rss: 72Mb L: 20/20 MS: 3 ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:08:09.489 [2024-11-28 16:32:06.883663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:06.883695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.489 [2024-11-28 16:32:06.883753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:06.883767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.489 #6 NEW cov: 12292 ft: 12632 corp: 3/41b lim: 35 exec/s: 0 rss: 72Mb L: 20/20 MS: 1 ShuffleBytes- 00:08:09.489 [2024-11-28 16:32:06.943803] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 
cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:06.943829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.489 [2024-11-28 16:32:06.943903] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:06.943916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.489 [2024-11-28 16:32:06.943974] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:06.943987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.489 #7 NEW cov: 12298 ft: 12986 corp: 4/62b lim: 35 exec/s: 0 rss: 72Mb L: 21/21 MS: 1 CrossOver- 00:08:09.489 [2024-11-28 16:32:07.003946] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:8000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:07.003977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.489 [2024-11-28 16:32:07.004049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:07.004066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.489 [2024-11-28 16:32:07.004123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:07.004138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.489 #9 NEW cov: 12390 ft: 13388 corp: 5/83b lim: 35 exec/s: 0 rss: 72Mb L: 21/21 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:09.489 [2024-11-28 16:32:07.044099] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:07.044127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.489 [2024-11-28 16:32:07.044200] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:07.044215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.489 [2024-11-28 16:32:07.044269] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:07.044283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.489 #11 NEW cov: 12390 ft: 13514 corp: 6/110b lim: 35 exec/s: 0 rss: 72Mb L: 27/27 MS: 2 ChangeBit-InsertRepeatedBytes- 00:08:09.489 [2024-11-28 16:32:07.084222] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:09.489 [2024-11-28 16:32:07.084247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.489 [2024-11-28 16:32:07.084319] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:07.084333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.489 [2024-11-28 16:32:07.084388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.489 [2024-11-28 16:32:07.084401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.489 #12 NEW cov: 12390 ft: 13655 corp: 7/137b lim: 35 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 ChangeBit- 00:08:09.748 [2024-11-28 16:32:07.144413] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.748 [2024-11-28 16:32:07.144438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.748 [2024-11-28 16:32:07.144510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.748 [2024-11-28 16:32:07.144524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.748 [2024-11-28 16:32:07.144583] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.748 [2024-11-28 16:32:07.144596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.748 #13 NEW cov: 12390 ft: 13695 corp: 8/164b lim: 35 exec/s: 0 rss: 72Mb L: 27/27 MS: 1 CopyPart- 00:08:09.748 [2024-11-28 16:32:07.204772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.748 [2024-11-28 16:32:07.204798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.748 [2024-11-28 16:32:07.204854] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.204868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.204923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.204936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.204992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.205005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.749 #14 NEW cov: 12390 ft: 14012 corp: 9/192b lim: 35 
exec/s: 0 rss: 72Mb L: 28/28 MS: 1 CrossOver- 00:08:09.749 [2024-11-28 16:32:07.244820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.244845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.244902] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.244915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.244971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.244984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.245041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.245054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.749 #15 NEW cov: 12390 ft: 14054 corp: 10/221b lim: 35 exec/s: 0 rss: 73Mb L: 29/29 MS: 1 CMP- DE: "\004\000"- 00:08:09.749 [2024-11-28 16:32:07.304663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.304688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.304764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.304779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.749 #21 NEW cov: 12390 ft: 14101 corp: 11/241b lim: 35 exec/s: 0 rss: 73Mb L: 20/29 MS: 1 ChangeBit- 00:08:09.749 [2024-11-28 16:32:07.344971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.344996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.345069] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.345087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.345145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.345158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.749 #22 NEW cov: 12390 ft: 14111 corp: 12/262b lim: 35 exec/s: 0 rss: 73Mb L: 21/29 MS: 1 ChangeBinInt- 00:08:09.749 [2024-11-28 16:32:07.385214] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.385239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.385315] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.385329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.385387] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.385401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.749 [2024-11-28 16:32:07.385457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.749 [2024-11-28 16:32:07.385471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.008 #23 NEW cov: 12390 ft: 14123 corp: 13/291b lim: 35 exec/s: 0 rss: 73Mb L: 29/29 MS: 1 ChangeBinInt- 00:08:10.008 [2024-11-28 16:32:07.445047] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.008 [2024-11-28 16:32:07.445072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.008 [2024-11-28 16:32:07.445145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:000000fd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.008 [2024-11-28 16:32:07.445158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.008 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:10.008 #24 NEW cov: 12413 ft: 14171 corp: 14/311b lim: 35 exec/s: 0 rss: 73Mb L: 20/29 MS: 1 ChangeBinInt- 00:08:10.008 [2024-11-28 16:32:07.485472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.008 [2024-11-28 16:32:07.485497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.008 [2024-11-28 16:32:07.485554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.008 [2024-11-28 16:32:07.485567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.008 [2024-11-28 16:32:07.485626] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.008 [2024-11-28 16:32:07.485656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.008 [2024-11-28 16:32:07.485711] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.008 [2024-11-28 16:32:07.485724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.008 #25 NEW cov: 12413 ft: 14209 corp: 15/340b lim: 35 exec/s: 0 rss: 73Mb L: 29/29 MS: 1 ChangeBinInt- 00:08:10.008 [2024-11-28 16:32:07.525807] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.008 [2024-11-28 16:32:07.525832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.008 [2024-11-28 16:32:07.525890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.008 [2024-11-28 16:32:07.525904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.008 [2024-11-28 16:32:07.525959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.525974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.009 [2024-11-28 16:32:07.526032] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.526045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.009 [2024-11-28 16:32:07.526100] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.526113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.009 #26 NEW cov: 12413 ft: 14281 corp: 16/375b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:10.009 [2024-11-28 16:32:07.565704] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.565730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.009 [2024-11-28 16:32:07.565788] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.565804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.009 [2024-11-28 16:32:07.565860] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.565877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.009 [2024-11-28 16:32:07.565935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.565949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.009 NEW_FUNC[1/1]: 0x487e18 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:10.009 #29 NEW cov: 12423 ft: 14299 corp: 17/408b lim: 35 exec/s: 29 rss: 73Mb L: 33/35 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:10.009 [2024-11-28 16:32:07.605499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.605526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.009 [2024-11-28 16:32:07.605603] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.605617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.009 #30 NEW cov: 12423 ft: 14360 corp: 18/428b lim: 35 exec/s: 30 rss: 73Mb L: 20/35 MS: 1 ChangeBit- 00:08:10.009 [2024-11-28 16:32:07.645811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.645837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.009 [2024-11-28 16:32:07.645893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.645906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.009 [2024-11-28 16:32:07.645961] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.009 [2024-11-28 16:32:07.645974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.268 #31 NEW cov: 12423 ft: 14369 corp: 19/455b lim: 35 exec/s: 31 rss: 73Mb L: 27/35 MS: 1 ChangeBinInt- 00:08:10.268 [2024-11-28 16:32:07.705960] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.705985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.706060] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.706074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.706134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.706148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.268 #32 NEW cov: 12423 ft: 14376 corp: 20/482b lim: 35 exec/s: 32 rss: 73Mb L: 27/35 MS: 1 PersAutoDict- DE: "\004\000"- 00:08:10.268 [2024-11-28 16:32:07.746210] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.746235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.746296] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.746309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.746367] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.746380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.746437] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.746450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.268 #33 NEW cov: 12423 ft: 14391 corp: 21/512b lim: 35 exec/s: 33 rss: 73Mb L: 30/35 MS: 1 CrossOver- 00:08:10.268 [2024-11-28 16:32:07.806087] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.806113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.806170] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.806190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.268 #34 NEW cov: 12423 ft: 14412 corp: 22/532b lim: 35 exec/s: 34 rss: 73Mb L: 20/35 MS: 1 CopyPart- 00:08:10.268 [2024-11-28 16:32:07.866744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.866771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.866828] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.866841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.866898] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.866912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.866971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000075 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.866985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:08:10.268 [2024-11-28 16:32:07.867042] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.268 [2024-11-28 16:32:07.867057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.268 #35 NEW cov: 12423 ft: 14484 corp: 23/567b lim: 35 exec/s: 35 rss: 73Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:10.528 [2024-11-28 16:32:07.926744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:07.926770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:07.926847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:07.926861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:07.926922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:07.926935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:07.926995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000075 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:07.927009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:07.986907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:07.986931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:07.986992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:07.987005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:07.987076] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:07.987090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:07.987151] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000075 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:07.987165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.528 #37 NEW cov: 12423 ft: 14487 corp: 24/601b lim: 35 exec/s: 37 rss: 73Mb L: 34/35 MS: 2 EraseBytes-InsertByte- 00:08:10.528 [2024-11-28 16:32:08.027217] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
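Runs like this one emit a NEW or DONE status line for every coverage event, which makes quick post-hoc summaries easy once the console output is captured to a file. A small sketch, assuming the output above was saved as run14.log (a hypothetical path); it keeps the last value seen for each counter:

  awk '/ cov: / {
         for (i = 1; i <= NF; i++) {
           if ($i == "cov:")  cov  = $(i + 1)
           if ($i == "ft:")   ft   = $(i + 1)
           if ($i == "corp:") corp = $(i + 1)
         }
       }
       END { print "final cov:", cov, "ft:", ft, "corp:", corp }' run14.log

Applied to this run it would land on the figures of the final DONE line below.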
00:08:10.528 [2024-11-28 16:32:08.027243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:08.027301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.027314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:08.027373] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.027387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:08.027446] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.027459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:08.027521] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.027534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.528 #38 NEW cov: 12423 ft: 14542 corp: 25/636b lim: 35 exec/s: 38 rss: 74Mb L: 35/35 MS: 1 CopyPart- 00:08:10.528 [2024-11-28 16:32:08.087009] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.087035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:08.087095] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.087109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:08.087168] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.087181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.528 #39 NEW cov: 12423 ft: 14553 corp: 26/663b lim: 35 exec/s: 39 rss: 74Mb L: 27/35 MS: 1 ChangeBinInt- 00:08:10.528 [2024-11-28 16:32:08.127116] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.127142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:08.127199] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.127212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.528 [2024-11-28 16:32:08.127273] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.528 [2024-11-28 16:32:08.127289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.528 #40 NEW cov: 12423 ft: 14574 corp: 27/684b lim: 35 exec/s: 40 rss: 74Mb L: 21/35 MS: 1 InsertByte- 00:08:10.788 [2024-11-28 16:32:08.187596] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.187625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.187706] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.187720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.187778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.187791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.187847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.187863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.187920] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.187936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.788 #41 NEW cov: 12423 ft: 14590 corp: 28/719b lim: 35 exec/s: 41 rss: 74Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:10.788 [2024-11-28 16:32:08.227736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.227761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.227846] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000dc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.227862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.227958] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.227972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.228030] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.228043] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.788 NEW_FUNC[1/2]: 0x4812a8 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:10.788 NEW_FUNC[2/2]: 0x13556e8 in nvmf_ctrlr_set_features_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1604 00:08:10.788 #42 NEW cov: 12480 ft: 14692 corp: 29/754b lim: 35 exec/s: 42 rss: 74Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:10.788 [2024-11-28 16:32:08.287584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.287615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.287699] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.287716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.287789] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.287804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.788 #43 NEW cov: 12480 ft: 14710 corp: 30/775b lim: 35 exec/s: 43 rss: 74Mb L: 21/35 MS: 1 ChangeByte- 00:08:10.788 [2024-11-28 16:32:08.348124] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.348150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.348208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NUMBER OF QUEUES cid:5 cdw10:00000007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.348224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: COMMAND SEQUENCE ERROR (00/0c) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.348284] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.348297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.348354] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.348368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.348427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.348440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.788 NEW_FUNC[1/1]: 0x4858c8 in feat_number_of_queues 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:318 00:08:10.788 #49 NEW cov: 12514 ft: 14788 corp: 31/810b lim: 35 exec/s: 49 rss: 74Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:10.788 [2024-11-28 16:32:08.407739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.407765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.788 [2024-11-28 16:32:08.407838] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.788 [2024-11-28 16:32:08.407851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.788 #50 NEW cov: 12514 ft: 14792 corp: 32/830b lim: 35 exec/s: 50 rss: 74Mb L: 20/35 MS: 1 ShuffleBytes- 00:08:11.048 [2024-11-28 16:32:08.447851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.048 [2024-11-28 16:32:08.447877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.048 [2024-11-28 16:32:08.447951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.048 [2024-11-28 16:32:08.447967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.048 #51 NEW cov: 12514 ft: 14845 corp: 33/850b lim: 35 exec/s: 51 rss: 74Mb L: 20/35 MS: 1 ChangeBit- 00:08:11.048 [2024-11-28 16:32:08.487967] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.048 [2024-11-28 16:32:08.487996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.048 [2024-11-28 16:32:08.488073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.048 [2024-11-28 16:32:08.488087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.048 #52 NEW cov: 12514 ft: 14902 corp: 34/870b lim: 35 exec/s: 52 rss: 74Mb L: 20/35 MS: 1 ShuffleBytes- 00:08:11.048 [2024-11-28 16:32:08.528060] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.048 [2024-11-28 16:32:08.528086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.048 [2024-11-28 16:32:08.528170] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.048 [2024-11-28 16:32:08.528184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.048 #53 NEW cov: 12514 ft: 14907 corp: 35/890b lim: 35 exec/s: 53 rss: 74Mb L: 20/35 MS: 1 CrossOver- 00:08:11.048 [2024-11-28 16:32:08.568174] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET 
FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.048 [2024-11-28 16:32:08.568200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.048 [2024-11-28 16:32:08.568275] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.048 [2024-11-28 16:32:08.568290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.048 #54 NEW cov: 12514 ft: 14913 corp: 36/910b lim: 35 exec/s: 27 rss: 74Mb L: 20/35 MS: 1 ChangeByte- 00:08:11.048 #54 DONE cov: 12514 ft: 14913 corp: 36/910b lim: 35 exec/s: 27 rss: 74Mb 00:08:11.048 ###### Recommended dictionary. ###### 00:08:11.048 "\004\000" # Uses: 1 00:08:11.048 ###### End of recommended dictionary. ###### 00:08:11.048 Done 54 runs in 2 second(s) 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:11.307 16:32:08 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:11.308 [2024-11-28 
16:32:08.740332] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:11.308 [2024-11-28 16:32:08.740403] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3761082 ] 00:08:11.308 [2024-11-28 16:32:08.914424] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.308 [2024-11-28 16:32:08.935998] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.567 [2024-11-28 16:32:08.988524] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.567 [2024-11-28 16:32:09.004895] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:11.567 INFO: Running with entropic power schedule (0xFF, 100). 00:08:11.567 INFO: Seed: 1904071752 00:08:11.567 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:11.567 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:11.567 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:11.567 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.567 #2 INITED exec/s: 0 rss: 64Mb 00:08:11.567 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:11.567 This may also happen if the target rejected all inputs we tried so far 00:08:11.567 [2024-11-28 16:32:09.060509] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.567 [2024-11-28 16:32:09.060537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.567 [2024-11-28 16:32:09.060595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.567 [2024-11-28 16:32:09.060618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.567 [2024-11-28 16:32:09.060676] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.567 [2024-11-28 16:32:09.060690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.567 [2024-11-28 16:32:09.060743] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.567 [2024-11-28 16:32:09.060756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.826 NEW_FUNC[1/714]: 0x467e08 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:11.826 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:11.826 #11 NEW cov: 12167 ft: 12165 corp: 2/30b lim: 35 exec/s: 0 rss: 72Mb L: 29/29 MS: 4 CrossOver-InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:11.826 [2024-11-28 16:32:09.381428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.826 [2024-11-28 16:32:09.381459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.826 [2024-11-28 16:32:09.381520] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.826 [2024-11-28 16:32:09.381536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.826 [2024-11-28 16:32:09.381591] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.826 [2024-11-28 16:32:09.381609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.826 [2024-11-28 16:32:09.381665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.826 [2024-11-28 16:32:09.381678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.826 #12 NEW cov: 12280 ft: 12818 corp: 3/62b lim: 35 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:08:11.826 [2024-11-28 16:32:09.441518] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.826 [2024-11-28 16:32:09.441545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.826 [2024-11-28 16:32:09.441605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.826 [2024-11-28 16:32:09.441619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.826 [2024-11-28 16:32:09.441674] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.826 [2024-11-28 16:32:09.441687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.826 [2024-11-28 16:32:09.441745] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:11.826 [2024-11-28 16:32:09.441758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.826 #13 NEW cov: 12286 ft: 13013 corp: 4/95b lim: 35 exec/s: 0 rss: 72Mb L: 33/33 MS: 1 CopyPart- 00:08:12.086 [2024-11-28 16:32:09.481412] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.481439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.481497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.481511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:12.086 #14 NEW cov: 12371 ft: 13863 corp: 5/112b lim: 35 exec/s: 0 rss: 72Mb L: 17/33 MS: 1 CrossOver- 00:08:12.086 [2024-11-28 16:32:09.541795] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.541822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.541879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.541893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.541948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.541961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.542022] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.542035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.086 #15 NEW cov: 12371 ft: 13983 corp: 6/142b lim: 35 exec/s: 0 rss: 72Mb L: 30/33 MS: 1 InsertByte- 00:08:12.086 [2024-11-28 16:32:09.581930] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.581956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.582014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.582028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.582085] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.582098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.582156] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.582169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.086 #16 NEW cov: 12371 ft: 14062 corp: 7/172b lim: 35 exec/s: 0 rss: 72Mb L: 30/33 MS: 1 ChangeBinInt- 00:08:12.086 [2024-11-28 16:32:09.641916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.641943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.642015] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.642030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.642084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.642097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.086 #17 NEW cov: 12371 ft: 14256 corp: 8/198b lim: 35 exec/s: 0 rss: 72Mb L: 26/33 MS: 1 EraseBytes- 00:08:12.086 [2024-11-28 16:32:09.702216] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.702243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.702299] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.702313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.702369] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.702382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.086 [2024-11-28 16:32:09.702437] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.086 [2024-11-28 16:32:09.702450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.086 #18 NEW cov: 12371 ft: 14334 corp: 9/229b lim: 35 exec/s: 0 rss: 72Mb L: 31/33 MS: 1 InsertByte- 00:08:12.345 [2024-11-28 16:32:09.742330] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.345 [2024-11-28 16:32:09.742356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.345 [2024-11-28 16:32:09.742414] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.345 [2024-11-28 16:32:09.742428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.345 [2024-11-28 16:32:09.742484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.345 [2024-11-28 16:32:09.742498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.345 [2024-11-28 16:32:09.742554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.742567] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.346 #19 NEW cov: 12371 ft: 14396 corp: 10/261b lim: 35 exec/s: 0 rss: 72Mb L: 32/33 MS: 1 ChangeBit- 00:08:12.346 NEW_FUNC[1/1]: 0x487e18 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:12.346 #22 NEW cov: 12385 ft: 14627 corp: 11/272b lim: 35 exec/s: 0 rss: 72Mb L: 11/33 MS: 3 InsertByte-ChangeBit-CrossOver- 00:08:12.346 [2024-11-28 16:32:09.822697] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.822722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.822792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.822806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.822861] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.822874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.822929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.822943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.822997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.823011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.346 #23 NEW cov: 12385 ft: 14697 corp: 12/307b lim: 35 exec/s: 0 rss: 73Mb L: 35/35 MS: 1 CopyPart- 00:08:12.346 [2024-11-28 16:32:09.862732] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.862758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.862817] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.862831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.862891] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.862905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.862961] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.862975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.346 #24 NEW cov: 12385 ft: 14754 corp: 13/340b lim: 35 exec/s: 0 rss: 73Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:12.346 [2024-11-28 16:32:09.922594] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.922625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.922681] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.922695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.346 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:12.346 #25 NEW cov: 12408 ft: 14804 corp: 14/357b lim: 35 exec/s: 0 rss: 73Mb L: 17/35 MS: 1 CopyPart- 00:08:12.346 [2024-11-28 16:32:09.983048] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.983074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.983147] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.983161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.983217] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.983230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.346 [2024-11-28 16:32:09.983288] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.346 [2024-11-28 16:32:09.983302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.606 #26 NEW cov: 12408 ft: 14814 corp: 15/387b lim: 35 exec/s: 0 rss: 73Mb L: 30/35 MS: 1 ChangeBinInt- 00:08:12.606 [2024-11-28 16:32:10.023548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000001d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.023575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.023636] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000001d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.023651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.023709] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.023723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.023779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.023796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.606 #27 NEW cov: 12408 ft: 14843 corp: 16/417b lim: 35 exec/s: 27 rss: 73Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:08:12.606 [2024-11-28 16:32:10.083334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.083365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.083423] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.083437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.083491] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.083505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.083560] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.083573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.606 #28 NEW cov: 12408 ft: 14873 corp: 17/447b lim: 35 exec/s: 28 rss: 73Mb L: 30/35 MS: 1 CopyPart- 00:08:12.606 [2024-11-28 16:32:10.123339] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.123368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.123427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.123441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.123500] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.123514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.606 #29 NEW cov: 12408 ft: 14933 corp: 18/473b lim: 35 exec/s: 29 rss: 73Mb L: 26/35 MS: 1 ChangeByte- 00:08:12.606 [2024-11-28 16:32:10.183457] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 
16:32:10.183484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.183558] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.183572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.183632] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.183647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.606 #30 NEW cov: 12408 ft: 14945 corp: 19/496b lim: 35 exec/s: 30 rss: 73Mb L: 23/35 MS: 1 EraseBytes- 00:08:12.606 [2024-11-28 16:32:10.243816] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000001f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.243842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.243905] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.243920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.243978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.243992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.606 [2024-11-28 16:32:10.244049] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.606 [2024-11-28 16:32:10.244063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.866 #31 NEW cov: 12408 ft: 15014 corp: 20/527b lim: 35 exec/s: 31 rss: 73Mb L: 31/35 MS: 1 ChangeBinInt- 00:08:12.866 [2024-11-28 16:32:10.303710] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.303736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.303798] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.303812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.866 #32 NEW cov: 12408 ft: 15039 corp: 21/544b lim: 35 exec/s: 32 rss: 73Mb L: 17/35 MS: 1 ChangeBit- 00:08:12.866 [2024-11-28 16:32:10.344058] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.344083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.344143] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.344157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.344215] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.344229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.344286] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.344299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.866 #33 NEW cov: 12408 ft: 15049 corp: 22/576b lim: 35 exec/s: 33 rss: 73Mb L: 32/35 MS: 1 ChangeBinInt- 00:08:12.866 [2024-11-28 16:32:10.384154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000001f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.384179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.384241] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.384255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.384315] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.384332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.384388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.384402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.866 #34 NEW cov: 12408 ft: 15065 corp: 23/607b lim: 35 exec/s: 34 rss: 73Mb L: 31/35 MS: 1 CrossOver- 00:08:12.866 [2024-11-28 16:32:10.444316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.444342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.444418] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.444433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.444490] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.444504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.444563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.444576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.866 #35 NEW cov: 12408 ft: 15081 corp: 24/640b lim: 35 exec/s: 35 rss: 74Mb L: 33/35 MS: 1 ShuffleBytes- 00:08:12.866 [2024-11-28 16:32:10.504535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.504561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.504622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.504636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.504692] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.504706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.866 [2024-11-28 16:32:10.504763] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:12.866 [2024-11-28 16:32:10.504776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.125 #36 NEW cov: 12408 ft: 15099 corp: 25/672b lim: 35 exec/s: 36 rss: 74Mb L: 32/35 MS: 1 EraseBytes- 00:08:13.125 [2024-11-28 16:32:10.544467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.125 [2024-11-28 16:32:10.544493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.125 [2024-11-28 16:32:10.544565] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.125 [2024-11-28 16:32:10.544579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.125 [2024-11-28 16:32:10.544641] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.125 [2024-11-28 16:32:10.544658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.126 #37 NEW cov: 12408 ft: 15118 corp: 26/698b lim: 35 exec/s: 37 rss: 74Mb L: 26/35 MS: 1 ShuffleBytes- 00:08:13.126 [2024-11-28 16:32:10.584722] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.584748] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.584811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.584825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.584884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.584898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.584956] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.584970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.126 #38 NEW cov: 12408 ft: 15126 corp: 27/730b lim: 35 exec/s: 38 rss: 74Mb L: 32/35 MS: 1 ChangeByte- 00:08:13.126 [2024-11-28 16:32:10.624872] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.624897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.624957] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.624970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.625028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.625042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.625101] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.625115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.126 #39 NEW cov: 12408 ft: 15134 corp: 28/762b lim: 35 exec/s: 39 rss: 74Mb L: 32/35 MS: 1 ShuffleBytes- 00:08:13.126 [2024-11-28 16:32:10.664807] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000004dc SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.664833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.664893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.664907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.664964] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.664978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.126 #40 NEW cov: 12408 ft: 15152 corp: 29/788b lim: 35 exec/s: 40 rss: 74Mb L: 26/35 MS: 1 CMP- DE: "\334\212\340\332=E\223\000"- 00:08:13.126 [2024-11-28 16:32:10.724967] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.724994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.725055] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000445 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.725069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.725128] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.725143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.126 #41 NEW cov: 12408 ft: 15175 corp: 30/813b lim: 35 exec/s: 41 rss: 74Mb L: 25/35 MS: 1 PersAutoDict- DE: "\334\212\340\332=E\223\000"- 00:08:13.126 [2024-11-28 16:32:10.765408] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.765435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.765497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.765511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.765568] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.765581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.765641] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.765655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.126 [2024-11-28 16:32:10.765712] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.126 [2024-11-28 16:32:10.765726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.386 #42 NEW cov: 12408 ft: 15255 corp: 31/848b lim: 35 exec/s: 42 rss: 74Mb L: 35/35 MS: 1 CrossOver- 00:08:13.386 [2024-11-28 16:32:10.825363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.825388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.825448] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000365 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.825462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.825520] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.825534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.825590] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.825609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.386 #43 NEW cov: 12408 ft: 15270 corp: 32/882b lim: 35 exec/s: 43 rss: 74Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:08:13.386 [2024-11-28 16:32:10.865447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.865472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.865529] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.865543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.865604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.865617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.865677] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.865690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.386 #44 NEW cov: 12408 ft: 15285 corp: 33/915b lim: 35 exec/s: 44 rss: 74Mb L: 33/35 MS: 1 InsertByte- 00:08:13.386 [2024-11-28 16:32:10.925629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.925656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.925742] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.925757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.925814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.925827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.925885] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.925898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.386 #45 NEW cov: 12408 ft: 15307 corp: 34/948b lim: 35 exec/s: 45 rss: 74Mb L: 33/35 MS: 1 ChangeByte- 00:08:13.386 [2024-11-28 16:32:10.965766] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.965791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.965865] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.965879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.965939] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.965953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:10.966010] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:10.966027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.386 #46 NEW cov: 12408 ft: 15320 corp: 35/980b lim: 35 exec/s: 46 rss: 74Mb L: 32/35 MS: 1 ChangeByte- 00:08:13.386 [2024-11-28 16:32:11.025911] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:11.025937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:11.025996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:11.026010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:11.026083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.386 [2024-11-28 16:32:11.026097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.386 [2024-11-28 16:32:11.026159] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:13.386 [2024-11-28 16:32:11.026172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.646 #47 NEW cov: 12408 ft: 15324 corp: 36/1012b lim: 35 exec/s: 23 rss: 74Mb L: 32/35 MS: 1 PersAutoDict- DE: "\334\212\340\332=E\223\000"- 00:08:13.646 #47 DONE cov: 12408 ft: 15324 corp: 36/1012b lim: 35 exec/s: 23 rss: 74Mb 00:08:13.646 ###### Recommended dictionary. ###### 00:08:13.646 "\334\212\340\332=E\223\000" # Uses: 2 00:08:13.646 ###### End of recommended dictionary. ###### 00:08:13.646 Done 47 runs in 2 second(s) 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:13.646 16:32:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:13.646 [2024-11-28 16:32:11.194898] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:13.646 [2024-11-28 16:32:11.194987] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3761584 ] 00:08:13.905 [2024-11-28 16:32:11.371301] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.905 [2024-11-28 16:32:11.393212] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.905 [2024-11-28 16:32:11.445308] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.905 [2024-11-28 16:32:11.461687] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:13.905 INFO: Running with entropic power schedule (0xFF, 100). 00:08:13.905 INFO: Seed: 65113479 00:08:13.905 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:13.905 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:13.905 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:13.905 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.905 #2 INITED exec/s: 0 rss: 64Mb 00:08:13.905 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.905 This may also happen if the target rejected all inputs we tried so far 00:08:13.905 [2024-11-28 16:32:11.506534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.905 [2024-11-28 16:32:11.506569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.905 [2024-11-28 16:32:11.506614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.905 [2024-11-28 16:32:11.506633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.905 [2024-11-28 16:32:11.506664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.905 [2024-11-28 16:32:11.506681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.905 [2024-11-28 16:32:11.506710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.905 [2024-11-28 16:32:11.506726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.424 NEW_FUNC[1/715]: 0x4692c8 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:14.424 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:14.424 #7 NEW cov: 12271 ft: 12269 corp: 2/103b lim: 105 exec/s: 0 rss: 72Mb L: 102/102 MS: 5 InsertByte-ShuffleBytes-CMP-CrossOver-InsertRepeatedBytes- DE: "e\000"- 00:08:14.424 [2024-11-28 16:32:11.857382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.857417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:11.857466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.857484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:11.857518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.857535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:11.857564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.857580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.424 #8 NEW cov: 12384 ft: 12858 corp: 3/205b lim: 105 exec/s: 0 rss: 72Mb L: 102/102 MS: 1 ChangeByte- 00:08:14.424 [2024-11-28 16:32:11.947548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.947579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:11.947621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.947640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:11.947671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.947687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:11.947716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.947732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.424 #9 NEW cov: 12390 ft: 13177 corp: 4/296b lim: 105 exec/s: 0 rss: 72Mb L: 91/102 MS: 1 EraseBytes- 00:08:14.424 [2024-11-28 16:32:11.997552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:1591483802247173654 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.997580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:11.997634] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.997653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:11.997684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1591483802437686806 len:5655 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:11.997701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.424 #11 NEW cov: 12475 ft: 13992 corp: 5/369b lim: 105 exec/s: 0 rss: 72Mb L: 73/102 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:14.424 [2024-11-28 16:32:12.057839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:12.057869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:12.057902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:12.057920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:12.057955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:12.057972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.424 [2024-11-28 16:32:12.058001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.424 [2024-11-28 16:32:12.058017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.684 #12 NEW cov: 12475 ft: 14100 corp: 6/460b lim: 105 exec/s: 0 rss: 72Mb L: 91/102 MS: 1 ChangeByte- 00:08:14.684 [2024-11-28 16:32:12.148068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.148098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.684 [2024-11-28 16:32:12.148130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.148148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.684 [2024-11-28 16:32:12.148179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.148195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1 00:08:14.684 [2024-11-28 16:32:12.148240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.148256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.684 #13 NEW cov: 12475 ft: 14187 corp: 7/564b lim: 105 exec/s: 0 rss: 72Mb L: 104/104 MS: 1 CrossOver- 00:08:14.684 [2024-11-28 16:32:12.238313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.238344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.684 [2024-11-28 16:32:12.238377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.238394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.684 [2024-11-28 16:32:12.238424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.238441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.684 [2024-11-28 16:32:12.238471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.238487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.684 #14 NEW cov: 12475 ft: 14307 corp: 8/662b lim: 105 exec/s: 0 rss: 72Mb L: 98/104 MS: 1 InsertRepeatedBytes- 00:08:14.684 [2024-11-28 16:32:12.328626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.328658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.684 [2024-11-28 16:32:12.328696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.328714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.684 [2024-11-28 16:32:12.328746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.328763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.684 [2024-11-28 16:32:12.328792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.328809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.684 [2024-11-28 16:32:12.328838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.684 [2024-11-28 16:32:12.328855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.943 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:14.943 #15 NEW cov: 12492 ft: 14379 corp: 9/767b lim: 105 exec/s: 0 rss: 73Mb L: 105/105 MS: 1 InsertByte- 00:08:14.943 [2024-11-28 16:32:12.418742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.943 [2024-11-28 16:32:12.418773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.943 [2024-11-28 16:32:12.418806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.943 [2024-11-28 16:32:12.418824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.944 [2024-11-28 16:32:12.418855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.944 [2024-11-28 16:32:12.418871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.944 #16 NEW cov: 12492 ft: 14452 corp: 10/837b lim: 105 exec/s: 0 rss: 73Mb L: 70/105 MS: 1 EraseBytes- 00:08:14.944 [2024-11-28 16:32:12.478870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.944 [2024-11-28 16:32:12.478900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.944 [2024-11-28 16:32:12.478947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.944 [2024-11-28 16:32:12.478966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.944 [2024-11-28 16:32:12.478997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.944 [2024-11-28 16:32:12.479014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.944 [2024-11-28 16:32:12.479043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.944 [2024-11-28 16:32:12.479064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.944 #17 NEW cov: 12492 ft: 14534 corp: 11/936b lim: 105 exec/s: 17 rss: 73Mb L: 99/105 MS: 1 CopyPart- 00:08:14.944 [2024-11-28 
16:32:12.569167] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.944 [2024-11-28 16:32:12.569197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.944 [2024-11-28 16:32:12.569244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.944 [2024-11-28 16:32:12.569261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.944 [2024-11-28 16:32:12.569292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.944 [2024-11-28 16:32:12.569308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.944 [2024-11-28 16:32:12.569337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.944 [2024-11-28 16:32:12.569353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.203 #18 NEW cov: 12492 ft: 14551 corp: 12/1040b lim: 105 exec/s: 18 rss: 73Mb L: 104/105 MS: 1 InsertRepeatedBytes- 00:08:15.203 [2024-11-28 16:32:12.619278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.203 [2024-11-28 16:32:12.619308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.203 [2024-11-28 16:32:12.619341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.619358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.619388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.619405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.619434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.619450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.204 #19 NEW cov: 12492 ft: 14567 corp: 13/1144b lim: 105 exec/s: 19 rss: 73Mb L: 104/105 MS: 1 ShuffleBytes- 00:08:15.204 [2024-11-28 16:32:12.709542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.709573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.709628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:71777240047681536 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.709647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.709679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.709700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.709730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.709746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.204 #20 NEW cov: 12492 ft: 14589 corp: 14/1248b lim: 105 exec/s: 20 rss: 73Mb L: 104/105 MS: 1 ChangeBinInt- 00:08:15.204 [2024-11-28 16:32:12.799771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.799802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.799835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.799853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.799883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.799900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.799929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.799945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.204 #21 NEW cov: 12492 ft: 14627 corp: 15/1352b lim: 105 exec/s: 21 rss: 73Mb L: 104/105 MS: 1 CMP- DE: "\366\377\377\377"- 00:08:15.204 [2024-11-28 16:32:12.849927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.849958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.849991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.850010] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.850042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.850058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.204 [2024-11-28 16:32:12.850088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.204 [2024-11-28 16:32:12.850105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.463 #22 NEW cov: 12492 ft: 14653 corp: 16/1456b lim: 105 exec/s: 22 rss: 73Mb L: 104/105 MS: 1 ChangeBinInt- 00:08:15.463 [2024-11-28 16:32:12.900020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.463 [2024-11-28 16:32:12.900049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.463 [2024-11-28 16:32:12.900095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:71777240047681536 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.463 [2024-11-28 16:32:12.900117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.463 [2024-11-28 16:32:12.900148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65408 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.463 [2024-11-28 16:32:12.900165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.463 [2024-11-28 16:32:12.900193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.463 [2024-11-28 16:32:12.900210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.463 #23 NEW cov: 12492 ft: 14711 corp: 17/1560b lim: 105 exec/s: 23 rss: 73Mb L: 104/105 MS: 1 ChangeBit- 00:08:15.463 [2024-11-28 16:32:12.990322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.463 [2024-11-28 16:32:12.990352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.463 [2024-11-28 16:32:12.990400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.463 [2024-11-28 16:32:12.990418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.463 [2024-11-28 16:32:12.990450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:15.464 [2024-11-28 16:32:12.990467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.464 [2024-11-28 16:32:12.990496] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.464 [2024-11-28 16:32:12.990512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.464 [2024-11-28 16:32:12.990542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.464 [2024-11-28 16:32:12.990558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.464 #24 NEW cov: 12492 ft: 14736 corp: 18/1665b lim: 105 exec/s: 24 rss: 73Mb L: 105/105 MS: 1 ShuffleBytes- 00:08:15.464 [2024-11-28 16:32:13.080476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.464 [2024-11-28 16:32:13.080504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.464 [2024-11-28 16:32:13.080551] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.464 [2024-11-28 16:32:13.080568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.464 [2024-11-28 16:32:13.080606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.464 [2024-11-28 16:32:13.080623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.464 [2024-11-28 16:32:13.080652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.464 [2024-11-28 16:32:13.080672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.723 #25 NEW cov: 12492 ft: 14791 corp: 19/1769b lim: 105 exec/s: 25 rss: 73Mb L: 104/105 MS: 1 ChangeByte- 00:08:15.723 [2024-11-28 16:32:13.130645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.130675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.130732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:71777240047681536 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.130750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.130781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:281470681743360 
len:65408 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.130798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.130828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.130845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.723 #26 NEW cov: 12492 ft: 14803 corp: 20/1873b lim: 105 exec/s: 26 rss: 73Mb L: 104/105 MS: 1 ChangeBinInt- 00:08:15.723 [2024-11-28 16:32:13.220958] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.220987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.221034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.221052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.221083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18409307901807034367 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.221100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.221128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.221143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.221172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.221188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.723 #27 NEW cov: 12492 ft: 14847 corp: 21/1978b lim: 105 exec/s: 27 rss: 73Mb L: 105/105 MS: 1 InsertByte- 00:08:15.723 [2024-11-28 16:32:13.311154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.311183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.311216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446462710401990911 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.311238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.311269] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.311285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.723 [2024-11-28 16:32:13.311329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.723 [2024-11-28 16:32:13.311346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.983 #28 NEW cov: 12492 ft: 14897 corp: 22/2082b lim: 105 exec/s: 28 rss: 74Mb L: 104/105 MS: 1 ShuffleBytes- 00:08:15.983 [2024-11-28 16:32:13.401413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.983 [2024-11-28 16:32:13.401442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.983 [2024-11-28 16:32:13.401488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.983 [2024-11-28 16:32:13.401506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.983 [2024-11-28 16:32:13.401537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.983 [2024-11-28 16:32:13.401554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.983 [2024-11-28 16:32:13.401582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.983 [2024-11-28 16:32:13.401605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.983 [2024-11-28 16:32:13.401635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:15360 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.983 [2024-11-28 16:32:13.401651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:15.983 #29 NEW cov: 12499 ft: 14921 corp: 23/2187b lim: 105 exec/s: 29 rss: 74Mb L: 105/105 MS: 1 CopyPart- 00:08:15.983 [2024-11-28 16:32:13.451444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744071462085898 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.983 [2024-11-28 16:32:13.451473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.983 [2024-11-28 16:32:13.451520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:72057589742960640 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.983 [2024-11-28 16:32:13.451538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.983 [2024-11-28 
16:32:13.451569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.983 [2024-11-28 16:32:13.451585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.983 [2024-11-28 16:32:13.451620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.983 [2024-11-28 16:32:13.451641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.983 #30 NEW cov: 12499 ft: 14951 corp: 24/2286b lim: 105 exec/s: 15 rss: 74Mb L: 99/105 MS: 1 ShuffleBytes- 00:08:15.983 #30 DONE cov: 12499 ft: 14951 corp: 24/2286b lim: 105 exec/s: 15 rss: 74Mb 00:08:15.983 ###### Recommended dictionary. ###### 00:08:15.984 "e\000" # Uses: 0 00:08:15.984 "\366\377\377\377" # Uses: 0 00:08:15.984 ###### End of recommended dictionary. ###### 00:08:15.984 Done 30 runs in 2 second(s) 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:16.243 16:32:13 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 
subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:16.243 [2024-11-28 16:32:13.672216] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:16.243 [2024-11-28 16:32:13.672298] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3761899 ] 00:08:16.243 [2024-11-28 16:32:13.846919] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.243 [2024-11-28 16:32:13.868698] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.502 [2024-11-28 16:32:13.921324] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:16.502 [2024-11-28 16:32:13.937714] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:16.502 INFO: Running with entropic power schedule (0xFF, 100). 00:08:16.502 INFO: Seed: 2540103648 00:08:16.502 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:16.502 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:16.502 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:16.502 INFO: A corpus is not provided, starting from an empty corpus 00:08:16.502 #2 INITED exec/s: 0 rss: 65Mb 00:08:16.502 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:16.502 This may also happen if the target rejected all inputs we tried so far 00:08:16.502 [2024-11-28 16:32:14.004098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.502 [2024-11-28 16:32:14.004134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.502 [2024-11-28 16:32:14.004251] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.502 [2024-11-28 16:32:14.004274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.502 [2024-11-28 16:32:14.004387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.502 [2024-11-28 16:32:14.004408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.761 NEW_FUNC[1/716]: 0x46c648 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:16.761 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.761 #12 NEW cov: 12292 ft: 12292 corp: 2/90b lim: 120 exec/s: 0 rss: 72Mb L: 89/89 MS: 5 ShuffleBytes-ShuffleBytes-ShuffleBytes-ChangeBinInt-InsertRepeatedBytes- 00:08:16.761 [2024-11-28 16:32:14.345214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1688849860263935 len:65536 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:16.761 [2024-11-28 16:32:14.345279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.761 [2024-11-28 16:32:14.345424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.761 [2024-11-28 16:32:14.345461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.761 [2024-11-28 16:32:14.345617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.761 [2024-11-28 16:32:14.345653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.761 #13 NEW cov: 12405 ft: 12858 corp: 3/179b lim: 120 exec/s: 0 rss: 72Mb L: 89/89 MS: 1 CMP- DE: "\000\005"- 00:08:17.021 [2024-11-28 16:32:14.414922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.414954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.415066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.415091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.021 #17 NEW cov: 12411 ft: 13593 corp: 4/231b lim: 120 exec/s: 0 rss: 72Mb L: 52/89 MS: 4 ShuffleBytes-CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:17.021 [2024-11-28 16:32:14.465509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.465542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.465625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.465668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.465780] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.465803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.465918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.465943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.021 #18 NEW cov: 12496 ft: 14142 corp: 5/341b lim: 120 exec/s: 0 rss: 72Mb L: 110/110 MS: 1 InsertRepeatedBytes- 00:08:17.021 [2024-11-28 16:32:14.515176] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.515204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.515340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18391012028320841727 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.515363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.021 #19 NEW cov: 12496 ft: 14213 corp: 6/393b lim: 120 exec/s: 0 rss: 72Mb L: 52/110 MS: 1 ChangeByte- 00:08:17.021 [2024-11-28 16:32:14.585832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.585868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.585978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.586007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.586139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.586163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.586282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.586301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.021 #25 NEW cov: 12496 ft: 14268 corp: 7/503b lim: 120 exec/s: 0 rss: 73Mb L: 110/110 MS: 1 ChangeByte- 00:08:17.021 [2024-11-28 16:32:14.655889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.655922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.656017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.656042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.021 [2024-11-28 16:32:14.656172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.021 [2024-11-28 16:32:14.656200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.281 #26 NEW cov: 12496 ft: 14326 corp: 8/581b lim: 120 exec/s: 0 rss: 73Mb L: 
78/110 MS: 1 EraseBytes- 00:08:17.281 [2024-11-28 16:32:14.705785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.281 [2024-11-28 16:32:14.705819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.281 [2024-11-28 16:32:14.705925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.281 [2024-11-28 16:32:14.705947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.281 #27 NEW cov: 12496 ft: 14370 corp: 9/648b lim: 120 exec/s: 0 rss: 73Mb L: 67/110 MS: 1 EraseBytes- 00:08:17.281 [2024-11-28 16:32:14.776289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.281 [2024-11-28 16:32:14.776322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.281 [2024-11-28 16:32:14.776431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.281 [2024-11-28 16:32:14.776455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.281 [2024-11-28 16:32:14.776577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.281 [2024-11-28 16:32:14.776601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.281 #28 NEW cov: 12496 ft: 14402 corp: 10/726b lim: 120 exec/s: 0 rss: 73Mb L: 78/110 MS: 1 ShuffleBytes- 00:08:17.281 [2024-11-28 16:32:14.826637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.281 [2024-11-28 16:32:14.826671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.281 [2024-11-28 16:32:14.826753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.281 [2024-11-28 16:32:14.826779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.281 [2024-11-28 16:32:14.826901] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.281 [2024-11-28 16:32:14.826926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.281 [2024-11-28 16:32:14.827052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.281 [2024-11-28 16:32:14.827079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.281 #29 NEW cov: 12496 ft: 14456 corp: 11/840b lim: 120 exec/s: 0 rss: 73Mb L: 114/114 MS: 1 InsertRepeatedBytes- 00:08:17.282 [2024-11-28 16:32:14.876577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1688849860263935 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.282 [2024-11-28 16:32:14.876619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.282 [2024-11-28 16:32:14.876739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.282 [2024-11-28 16:32:14.876767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.282 [2024-11-28 16:32:14.876904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.282 [2024-11-28 16:32:14.876926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.282 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:17.282 #30 NEW cov: 12519 ft: 14498 corp: 12/929b lim: 120 exec/s: 0 rss: 73Mb L: 89/114 MS: 1 ChangeBinInt- 00:08:17.541 [2024-11-28 16:32:14.947003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:14.947035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.541 [2024-11-28 16:32:14.947105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:14.947132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.541 [2024-11-28 16:32:14.947268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:14.947296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.541 [2024-11-28 16:32:14.947427] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:14.947450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.541 #31 NEW cov: 12519 ft: 14537 corp: 13/1044b lim: 120 exec/s: 31 rss: 73Mb L: 115/115 MS: 1 CopyPart- 00:08:17.541 [2024-11-28 16:32:15.016650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:15.016686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.541 
[2024-11-28 16:32:15.016786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18391012028320841727 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:15.016811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.541 #32 NEW cov: 12519 ft: 14583 corp: 14/1096b lim: 120 exec/s: 32 rss: 73Mb L: 52/115 MS: 1 ChangeBit- 00:08:17.541 [2024-11-28 16:32:15.087448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:15.087480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.541 [2024-11-28 16:32:15.087567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:15.087593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.541 [2024-11-28 16:32:15.087713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:83886080 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:15.087737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.541 [2024-11-28 16:32:15.087857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:15.087877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.541 #33 NEW cov: 12519 ft: 14676 corp: 15/1206b lim: 120 exec/s: 33 rss: 73Mb L: 110/115 MS: 1 ChangeBinInt- 00:08:17.541 [2024-11-28 16:32:15.157081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:15.157115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.541 [2024-11-28 16:32:15.157230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18391012028320841727 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.541 [2024-11-28 16:32:15.157256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.541 #34 NEW cov: 12519 ft: 14706 corp: 16/1258b lim: 120 exec/s: 34 rss: 73Mb L: 52/115 MS: 1 CrossOver- 00:08:17.800 [2024-11-28 16:32:15.207850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.207881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.800 [2024-11-28 16:32:15.207964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.207982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.800 [2024-11-28 16:32:15.208104] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.208140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.800 [2024-11-28 16:32:15.208263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.208288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.800 #35 NEW cov: 12519 ft: 14726 corp: 17/1357b lim: 120 exec/s: 35 rss: 73Mb L: 99/115 MS: 1 EraseBytes- 00:08:17.800 [2024-11-28 16:32:15.277760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.277794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.800 [2024-11-28 16:32:15.277912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65287 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.277939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.800 [2024-11-28 16:32:15.278064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.278091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.800 #36 NEW cov: 12519 ft: 14741 corp: 18/1435b lim: 120 exec/s: 36 rss: 73Mb L: 78/115 MS: 1 ChangeBinInt- 00:08:17.800 [2024-11-28 16:32:15.328179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.328215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.800 [2024-11-28 16:32:15.328339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888556372449 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.328362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.800 [2024-11-28 16:32:15.328475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.328498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.800 [2024-11-28 16:32:15.328621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.800 [2024-11-28 16:32:15.328646] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.800 #37 NEW cov: 12519 ft: 14775 corp: 19/1550b lim: 120 exec/s: 37 rss: 73Mb L: 115/115 MS: 1 InsertByte- 00:08:17.801 [2024-11-28 16:32:15.398464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.801 [2024-11-28 16:32:15.398499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.801 [2024-11-28 16:32:15.398616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.801 [2024-11-28 16:32:15.398639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.801 [2024-11-28 16:32:15.398767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:83886080 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.801 [2024-11-28 16:32:15.398788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.801 [2024-11-28 16:32:15.398904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:3974949888 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.801 [2024-11-28 16:32:15.398927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.801 #38 NEW cov: 12519 ft: 14816 corp: 20/1663b lim: 120 exec/s: 38 rss: 74Mb L: 113/115 MS: 1 InsertRepeatedBytes- 00:08:18.059 [2024-11-28 16:32:15.468624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.059 [2024-11-28 16:32:15.468656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.059 [2024-11-28 16:32:15.468741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888556372449 len:57692 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.059 [2024-11-28 16:32:15.468762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.059 [2024-11-28 16:32:15.468891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.059 [2024-11-28 16:32:15.468919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.059 [2024-11-28 16:32:15.469049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.059 [2024-11-28 16:32:15.469076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.059 #39 NEW cov: 12519 ft: 14857 corp: 21/1778b lim: 120 exec/s: 39 rss: 74Mb L: 115/115 MS: 1 ChangeByte- 00:08:18.059 [2024-11-28 16:32:15.538273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 
len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.059 [2024-11-28 16:32:15.538306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.059 [2024-11-28 16:32:15.538424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.059 [2024-11-28 16:32:15.538445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.059 #40 NEW cov: 12519 ft: 14866 corp: 22/1845b lim: 120 exec/s: 40 rss: 74Mb L: 67/115 MS: 1 CopyPart- 00:08:18.059 [2024-11-28 16:32:15.608546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744069867569151 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.060 [2024-11-28 16:32:15.608582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.060 [2024-11-28 16:32:15.608714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18391012028320841727 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.060 [2024-11-28 16:32:15.608739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.060 #41 NEW cov: 12519 ft: 14878 corp: 23/1897b lim: 120 exec/s: 41 rss: 74Mb L: 52/115 MS: 1 ChangeBinInt- 00:08:18.060 [2024-11-28 16:32:15.679093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1688849860263935 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.060 [2024-11-28 16:32:15.679129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.060 [2024-11-28 16:32:15.679238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.060 [2024-11-28 16:32:15.679264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.060 [2024-11-28 16:32:15.679391] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446511252122370047 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.060 [2024-11-28 16:32:15.679415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.060 #42 NEW cov: 12519 ft: 14898 corp: 24/1986b lim: 120 exec/s: 42 rss: 74Mb L: 89/115 MS: 1 CMP- DE: ",?"- 00:08:18.319 [2024-11-28 16:32:15.729495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.729528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.729632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538888556372449 len:57692 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.729659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:18.319 [2024-11-28 16:32:15.729795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.729821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.729943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.729970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.319 #43 NEW cov: 12519 ft: 14921 corp: 25/2103b lim: 120 exec/s: 43 rss: 74Mb L: 117/117 MS: 1 PersAutoDict- DE: "\000\005"- 00:08:18.319 [2024-11-28 16:32:15.799706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.799737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.799822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.799846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.799972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.799994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.800127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.800149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.319 #44 NEW cov: 12519 ft: 14943 corp: 26/2213b lim: 120 exec/s: 44 rss: 74Mb L: 110/117 MS: 1 CMP- DE: "\377~"- 00:08:18.319 [2024-11-28 16:32:15.849883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.849915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.849990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.850018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.850137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:83886080 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.850161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.850288] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:3974949888 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.850310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.319 #45 NEW cov: 12519 ft: 14959 corp: 27/2326b lim: 120 exec/s: 45 rss: 74Mb L: 113/117 MS: 1 ChangeBinInt- 00:08:18.319 [2024-11-28 16:32:15.909991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.910023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.910086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:16276538885293743655 len:57692 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.910113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.910232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.910261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.319 [2024-11-28 16:32:15.910379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:16276538888567251425 len:57826 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.319 [2024-11-28 16:32:15.910405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.319 #46 NEW cov: 12519 ft: 14981 corp: 28/2443b lim: 120 exec/s: 46 rss: 74Mb L: 117/117 MS: 1 ChangeBinInt- 00:08:18.579 [2024-11-28 16:32:15.980017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.579 [2024-11-28 16:32:15.980051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.579 [2024-11-28 16:32:15.980171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.579 [2024-11-28 16:32:15.980192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.579 [2024-11-28 16:32:15.980315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.579 [2024-11-28 16:32:15.980343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.579 #47 NEW cov: 12519 ft: 14990 corp: 29/2521b lim: 120 exec/s: 23 rss: 74Mb L: 78/117 MS: 1 ShuffleBytes- 00:08:18.579 #47 DONE cov: 12519 ft: 14990 corp: 29/2521b lim: 120 exec/s: 23 rss: 74Mb 00:08:18.579 ###### Recommended dictionary. ###### 00:08:18.579 "\000\005" # Uses: 1 00:08:18.579 ",?" 
# Uses: 0 00:08:18.579 "\377~" # Uses: 0 00:08:18.579 ###### End of recommended dictionary. ###### 00:08:18.579 Done 47 runs in 2 second(s) 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:18.579 16:32:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:18.579 [2024-11-28 16:32:16.153714] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:18.579 [2024-11-28 16:32:16.153805] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3762432 ] 00:08:18.839 [2024-11-28 16:32:16.330764] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.839 [2024-11-28 16:32:16.353343] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.839 [2024-11-28 16:32:16.405740] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.839 [2024-11-28 16:32:16.422045] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:18.839 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.839 INFO: Seed: 731137895 00:08:18.839 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:18.839 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:18.839 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:18.839 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.839 #2 INITED exec/s: 0 rss: 66Mb 00:08:18.839 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:18.839 This may also happen if the target rejected all inputs we tried so far 00:08:18.839 [2024-11-28 16:32:16.477333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:18.839 [2024-11-28 16:32:16.477364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.357 NEW_FUNC[1/714]: 0x46ff38 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:19.357 NEW_FUNC[2/714]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:19.357 #5 NEW cov: 12228 ft: 12221 corp: 2/30b lim: 100 exec/s: 0 rss: 73Mb L: 29/29 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:19.357 [2024-11-28 16:32:16.788325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.357 [2024-11-28 16:32:16.788362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.357 [2024-11-28 16:32:16.788432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.357 [2024-11-28 16:32:16.788451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.358 [2024-11-28 16:32:16.788508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.358 [2024-11-28 16:32:16.788525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.358 #7 NEW cov: 12348 ft: 13241 corp: 3/107b lim: 100 exec/s: 0 rss: 73Mb L: 77/77 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:19.358 [2024-11-28 16:32:16.828195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.358 [2024-11-28 16:32:16.828220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.358 [2024-11-28 16:32:16.828270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.358 [2024-11-28 16:32:16.828285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.358 #8 NEW cov: 12354 ft: 13662 corp: 4/154b lim: 100 exec/s: 0 rss: 73Mb L: 47/77 MS: 1 CopyPart- 00:08:19.358 [2024-11-28 16:32:16.888477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.358 [2024-11-28 16:32:16.888504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.358 [2024-11-28 16:32:16.888553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.358 [2024-11-28 16:32:16.888568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.358 [2024-11-28 16:32:16.888626] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.358 [2024-11-28 16:32:16.888641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.358 #9 NEW cov: 12439 ft: 13879 corp: 5/231b lim: 100 exec/s: 0 rss: 73Mb L: 77/77 MS: 1 CMP- DE: "\000\362"- 00:08:19.358 [2024-11-28 16:32:16.948927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.358 [2024-11-28 16:32:16.948954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.358 [2024-11-28 16:32:16.949019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.358 [2024-11-28 16:32:16.949034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.358 [2024-11-28 16:32:16.949083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.358 [2024-11-28 16:32:16.949097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.358 [2024-11-28 16:32:16.949146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.358 [2024-11-28 16:32:16.949160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.358 [2024-11-28 16:32:16.949214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:19.358 [2024-11-28 16:32:16.949228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.358 #10 NEW cov: 12439 ft: 14341 corp: 6/331b lim: 100 exec/s: 0 rss: 73Mb L: 100/100 MS: 1 CrossOver- 00:08:19.617 [2024-11-28 16:32:17.008825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.617 [2024-11-28 16:32:17.008852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:19.617 [2024-11-28 16:32:17.008890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.617 [2024-11-28 16:32:17.008904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.617 [2024-11-28 16:32:17.008959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.617 [2024-11-28 16:32:17.008973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.617 #11 NEW cov: 12439 ft: 14479 corp: 7/396b lim: 100 exec/s: 0 rss: 74Mb L: 65/100 MS: 1 InsertRepeatedBytes- 00:08:19.617 [2024-11-28 16:32:17.048931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.618 [2024-11-28 16:32:17.048957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.618 [2024-11-28 16:32:17.048993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.618 [2024-11-28 16:32:17.049008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.618 [2024-11-28 16:32:17.049060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.618 [2024-11-28 16:32:17.049081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.618 #12 NEW cov: 12439 ft: 14579 corp: 8/466b lim: 100 exec/s: 0 rss: 74Mb L: 70/100 MS: 1 EraseBytes- 00:08:19.618 [2024-11-28 16:32:17.108949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.618 [2024-11-28 16:32:17.108976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.618 [2024-11-28 16:32:17.109027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.618 [2024-11-28 16:32:17.109043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.618 #13 NEW cov: 12439 ft: 14666 corp: 9/513b lim: 100 exec/s: 0 rss: 74Mb L: 47/100 MS: 1 ChangeBit- 00:08:19.618 [2024-11-28 16:32:17.149036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.618 [2024-11-28 16:32:17.149061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.618 [2024-11-28 16:32:17.149111] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.618 [2024-11-28 16:32:17.149126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.618 #19 NEW cov: 12439 ft: 14737 corp: 10/560b lim: 100 exec/s: 0 rss: 74Mb L: 47/100 MS: 1 ChangeByte- 00:08:19.618 [2024-11-28 16:32:17.189137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.618 [2024-11-28 16:32:17.189163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.618 [2024-11-28 16:32:17.189214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.618 [2024-11-28 16:32:17.189229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.618 #20 NEW cov: 12439 ft: 14797 corp: 11/607b lim: 100 exec/s: 0 rss: 74Mb L: 47/100 MS: 1 ChangeBit- 00:08:19.618 [2024-11-28 16:32:17.249224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.618 [2024-11-28 16:32:17.249249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.877 #21 NEW cov: 12439 ft: 14810 corp: 12/636b lim: 100 exec/s: 0 rss: 74Mb L: 29/100 MS: 1 PersAutoDict- DE: "\000\362"- 00:08:19.877 [2024-11-28 16:32:17.289667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.877 [2024-11-28 16:32:17.289692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.877 [2024-11-28 16:32:17.289749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.877 [2024-11-28 16:32:17.289765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.877 [2024-11-28 16:32:17.289817] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.877 [2024-11-28 16:32:17.289831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.877 [2024-11-28 16:32:17.289884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:19.878 [2024-11-28 16:32:17.289899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.878 #22 NEW cov: 12439 ft: 14836 corp: 13/734b lim: 100 exec/s: 0 rss: 74Mb L: 98/100 MS: 1 EraseBytes- 00:08:19.878 [2024-11-28 16:32:17.349607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.878 [2024-11-28 16:32:17.349659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.878 [2024-11-28 16:32:17.349718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.878 [2024-11-28 16:32:17.349735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.878 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:19.878 #23 NEW cov: 12462 ft: 14948 corp: 14/781b lim: 100 exec/s: 0 rss: 74Mb L: 47/100 MS: 1 ChangeBit- 00:08:19.878 [2024-11-28 16:32:17.409708] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.878 [2024-11-28 16:32:17.409735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.878 #24 NEW cov: 12462 ft: 14962 corp: 15/812b lim: 100 exec/s: 0 rss: 
74Mb L: 31/100 MS: 1 PersAutoDict- DE: "\000\362"- 00:08:19.878 [2024-11-28 16:32:17.449909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.878 [2024-11-28 16:32:17.449937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.878 [2024-11-28 16:32:17.449976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.878 [2024-11-28 16:32:17.449991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.878 #25 NEW cov: 12462 ft: 14973 corp: 16/859b lim: 100 exec/s: 25 rss: 74Mb L: 47/100 MS: 1 ChangeBinInt- 00:08:19.878 [2024-11-28 16:32:17.490123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:19.878 [2024-11-28 16:32:17.490150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.878 [2024-11-28 16:32:17.490200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:19.878 [2024-11-28 16:32:17.490214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.878 [2024-11-28 16:32:17.490269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:19.878 [2024-11-28 16:32:17.490283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.138 #26 NEW cov: 12462 ft: 15000 corp: 17/924b lim: 100 exec/s: 26 rss: 74Mb L: 65/100 MS: 1 ChangeBinInt- 00:08:20.138 [2024-11-28 16:32:17.550180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.138 [2024-11-28 16:32:17.550206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.138 [2024-11-28 16:32:17.550270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.138 [2024-11-28 16:32:17.550284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.138 #27 NEW cov: 12462 ft: 15010 corp: 18/971b lim: 100 exec/s: 27 rss: 74Mb L: 47/100 MS: 1 ShuffleBytes- 00:08:20.138 [2024-11-28 16:32:17.590518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.138 [2024-11-28 16:32:17.590545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.138 [2024-11-28 16:32:17.590614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.138 [2024-11-28 16:32:17.590629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.138 [2024-11-28 16:32:17.590680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.138 [2024-11-28 16:32:17.590696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.138 
[2024-11-28 16:32:17.590747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:20.138 [2024-11-28 16:32:17.590762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.138 #28 NEW cov: 12462 ft: 15020 corp: 19/1056b lim: 100 exec/s: 28 rss: 74Mb L: 85/100 MS: 1 CrossOver- 00:08:20.138 [2024-11-28 16:32:17.630389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.138 [2024-11-28 16:32:17.630414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.138 [2024-11-28 16:32:17.630478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.138 [2024-11-28 16:32:17.630493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.138 #29 NEW cov: 12462 ft: 15087 corp: 20/1104b lim: 100 exec/s: 29 rss: 74Mb L: 48/100 MS: 1 InsertByte- 00:08:20.138 [2024-11-28 16:32:17.690665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.138 [2024-11-28 16:32:17.690691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.138 [2024-11-28 16:32:17.690755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.138 [2024-11-28 16:32:17.690770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.138 [2024-11-28 16:32:17.690822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.138 [2024-11-28 16:32:17.690837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.138 #30 NEW cov: 12462 ft: 15102 corp: 21/1169b lim: 100 exec/s: 30 rss: 74Mb L: 65/100 MS: 1 ChangeBit- 00:08:20.138 [2024-11-28 16:32:17.730539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.138 [2024-11-28 16:32:17.730564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.138 #31 NEW cov: 12462 ft: 15170 corp: 22/1198b lim: 100 exec/s: 31 rss: 74Mb L: 29/100 MS: 1 ChangeByte- 00:08:20.138 [2024-11-28 16:32:17.770773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.138 [2024-11-28 16:32:17.770800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.138 [2024-11-28 16:32:17.770834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.138 [2024-11-28 16:32:17.770847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.398 #32 NEW cov: 12462 ft: 15184 corp: 23/1244b lim: 100 exec/s: 32 rss: 74Mb L: 46/100 MS: 1 EraseBytes- 00:08:20.398 [2024-11-28 16:32:17.811164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.398 [2024-11-28 
16:32:17.811191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.398 [2024-11-28 16:32:17.811257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.398 [2024-11-28 16:32:17.811272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.398 [2024-11-28 16:32:17.811324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.398 [2024-11-28 16:32:17.811337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.398 [2024-11-28 16:32:17.811393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:20.398 [2024-11-28 16:32:17.811407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.398 #33 NEW cov: 12462 ft: 15202 corp: 24/1324b lim: 100 exec/s: 33 rss: 74Mb L: 80/100 MS: 1 InsertRepeatedBytes- 00:08:20.398 [2024-11-28 16:32:17.871099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.398 [2024-11-28 16:32:17.871124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.398 [2024-11-28 16:32:17.871161] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.398 [2024-11-28 16:32:17.871175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.398 #34 NEW cov: 12462 ft: 15252 corp: 25/1371b lim: 100 exec/s: 34 rss: 74Mb L: 47/100 MS: 1 ShuffleBytes- 00:08:20.398 [2024-11-28 16:32:17.911315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.398 [2024-11-28 16:32:17.911340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.398 [2024-11-28 16:32:17.911372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.398 [2024-11-28 16:32:17.911386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.398 [2024-11-28 16:32:17.911437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.398 [2024-11-28 16:32:17.911451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.398 #40 NEW cov: 12462 ft: 15270 corp: 26/1436b lim: 100 exec/s: 40 rss: 74Mb L: 65/100 MS: 1 ShuffleBytes- 00:08:20.398 [2024-11-28 16:32:17.971612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.398 [2024-11-28 16:32:17.971639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.398 [2024-11-28 16:32:17.971686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.398 [2024-11-28 16:32:17.971701] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.398 [2024-11-28 16:32:17.971750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.398 [2024-11-28 16:32:17.971763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.398 [2024-11-28 16:32:17.971814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:20.398 [2024-11-28 16:32:17.971827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.398 #41 NEW cov: 12462 ft: 15306 corp: 27/1531b lim: 100 exec/s: 41 rss: 74Mb L: 95/100 MS: 1 InsertRepeatedBytes- 00:08:20.398 [2024-11-28 16:32:18.011346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.398 [2024-11-28 16:32:18.011372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.398 #42 NEW cov: 12462 ft: 15359 corp: 28/1568b lim: 100 exec/s: 42 rss: 74Mb L: 37/100 MS: 1 CopyPart- 00:08:20.658 [2024-11-28 16:32:18.051716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.658 [2024-11-28 16:32:18.051741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.658 [2024-11-28 16:32:18.051794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.658 [2024-11-28 16:32:18.051808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.658 [2024-11-28 16:32:18.051860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.658 [2024-11-28 16:32:18.051876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.658 #43 NEW cov: 12462 ft: 15410 corp: 29/1645b lim: 100 exec/s: 43 rss: 74Mb L: 77/100 MS: 1 PersAutoDict- DE: "\000\362"- 00:08:20.658 [2024-11-28 16:32:18.091553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.658 [2024-11-28 16:32:18.091579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.658 #44 NEW cov: 12462 ft: 15484 corp: 30/1669b lim: 100 exec/s: 44 rss: 74Mb L: 24/100 MS: 1 EraseBytes- 00:08:20.658 [2024-11-28 16:32:18.151873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.658 [2024-11-28 16:32:18.151898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.658 [2024-11-28 16:32:18.151934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.658 [2024-11-28 16:32:18.151949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.658 #45 NEW cov: 12462 ft: 15492 corp: 31/1717b lim: 100 exec/s: 45 rss: 74Mb L: 48/100 MS: 1 CopyPart- 
00:08:20.658 [2024-11-28 16:32:18.212173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.658 [2024-11-28 16:32:18.212200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.658 [2024-11-28 16:32:18.212245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.658 [2024-11-28 16:32:18.212260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.658 [2024-11-28 16:32:18.212313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.658 [2024-11-28 16:32:18.212344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.658 #46 NEW cov: 12462 ft: 15496 corp: 32/1783b lim: 100 exec/s: 46 rss: 74Mb L: 66/100 MS: 1 InsertByte- 00:08:20.658 [2024-11-28 16:32:18.251984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.658 [2024-11-28 16:32:18.252009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.658 #47 NEW cov: 12462 ft: 15504 corp: 33/1803b lim: 100 exec/s: 47 rss: 74Mb L: 20/100 MS: 1 EraseBytes- 00:08:20.918 [2024-11-28 16:32:18.312419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.918 [2024-11-28 16:32:18.312443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.919 [2024-11-28 16:32:18.312494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.919 [2024-11-28 16:32:18.312509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.919 [2024-11-28 16:32:18.312562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.919 [2024-11-28 16:32:18.312577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.919 #48 NEW cov: 12462 ft: 15520 corp: 34/1868b lim: 100 exec/s: 48 rss: 74Mb L: 65/100 MS: 1 CrossOver- 00:08:20.919 [2024-11-28 16:32:18.352363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.919 [2024-11-28 16:32:18.352389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.919 [2024-11-28 16:32:18.352444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.919 [2024-11-28 16:32:18.352459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.919 #49 NEW cov: 12462 ft: 15524 corp: 35/1915b lim: 100 exec/s: 49 rss: 74Mb L: 47/100 MS: 1 PersAutoDict- DE: "\000\362"- 00:08:20.919 [2024-11-28 16:32:18.412639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.919 [2024-11-28 16:32:18.412664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.919 [2024-11-28 16:32:18.412712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.919 [2024-11-28 16:32:18.412727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.919 [2024-11-28 16:32:18.412782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.919 [2024-11-28 16:32:18.412796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.919 #50 NEW cov: 12462 ft: 15536 corp: 36/1992b lim: 100 exec/s: 50 rss: 74Mb L: 77/100 MS: 1 ChangeByte- 00:08:20.919 [2024-11-28 16:32:18.452868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:20.919 [2024-11-28 16:32:18.452894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.919 [2024-11-28 16:32:18.452944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:20.919 [2024-11-28 16:32:18.452958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.919 [2024-11-28 16:32:18.453009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:20.919 [2024-11-28 16:32:18.453025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.919 [2024-11-28 16:32:18.453076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:20.919 [2024-11-28 16:32:18.453091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.919 #51 NEW cov: 12462 ft: 15615 corp: 37/2090b lim: 100 exec/s: 25 rss: 74Mb L: 98/100 MS: 1 PersAutoDict- DE: "\000\362"- 00:08:20.919 #51 DONE cov: 12462 ft: 15615 corp: 37/2090b lim: 100 exec/s: 25 rss: 74Mb 00:08:20.919 ###### Recommended dictionary. ###### 00:08:20.919 "\000\362" # Uses: 5 00:08:20.919 ###### End of recommended dictionary. 
###### 00:08:20.919 Done 51 runs in 2 second(s) 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:21.179 16:32:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:21.179 [2024-11-28 16:32:18.646405] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:21.179 [2024-11-28 16:32:18.646481] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3762791 ] 00:08:21.438 [2024-11-28 16:32:18.828401] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.438 [2024-11-28 16:32:18.851545] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.438 [2024-11-28 16:32:18.904124] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.438 [2024-11-28 16:32:18.920483] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:21.438 INFO: Running with entropic power schedule (0xFF, 100). 00:08:21.438 INFO: Seed: 3227138961 00:08:21.438 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:21.439 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:21.439 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:21.439 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.439 #2 INITED exec/s: 0 rss: 65Mb 00:08:21.439 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:21.439 This may also happen if the target rejected all inputs we tried so far 00:08:21.439 [2024-11-28 16:32:18.989717] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071713062911 len:65536 00:08:21.439 [2024-11-28 16:32:18.989755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.697 NEW_FUNC[1/713]: 0x472ef8 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:21.698 NEW_FUNC[2/713]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.698 #19 NEW cov: 12212 ft: 12209 corp: 2/19b lim: 50 exec/s: 0 rss: 72Mb L: 18/18 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:21.698 [2024-11-28 16:32:19.330711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:168427520 len:1 00:08:21.698 [2024-11-28 16:32:19.330758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.957 NEW_FUNC[1/1]: 0x1a1abb8 in nvme_tcp_read_data /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/nvme_tcp.h:405 00:08:21.957 #28 NEW cov: 12326 ft: 12909 corp: 3/33b lim: 50 exec/s: 0 rss: 72Mb L: 14/18 MS: 4 CopyPart-CMP-ChangeBit-CopyPart- DE: "\012\000\000\000\000\000\000\000"- 00:08:21.957 [2024-11-28 16:32:19.380725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446735275620040703 len:65536 00:08:21.957 [2024-11-28 16:32:19.380752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.957 #29 NEW cov: 12332 ft: 13105 corp: 4/51b lim: 50 exec/s: 0 rss: 72Mb L: 18/18 MS: 1 ChangeBit- 00:08:21.957 [2024-11-28 16:32:19.450912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:60297969664 len:1 
00:08:21.957 [2024-11-28 16:32:19.450943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.957 #30 NEW cov: 12417 ft: 13321 corp: 5/65b lim: 50 exec/s: 0 rss: 72Mb L: 14/18 MS: 1 ChangeBinInt- 00:08:21.957 [2024-11-28 16:32:19.521059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071713062911 len:65536 00:08:21.957 [2024-11-28 16:32:19.521091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.957 #31 NEW cov: 12417 ft: 13374 corp: 6/83b lim: 50 exec/s: 0 rss: 73Mb L: 18/18 MS: 1 ChangeBit- 00:08:21.957 [2024-11-28 16:32:19.591810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:60297969664 len:1 00:08:21.957 [2024-11-28 16:32:19.591839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.957 [2024-11-28 16:32:19.591907] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:21.957 [2024-11-28 16:32:19.591932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.957 [2024-11-28 16:32:19.592060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:21.957 [2024-11-28 16:32:19.592079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.957 [2024-11-28 16:32:19.592194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:21.957 [2024-11-28 16:32:19.592218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.217 #32 NEW cov: 12417 ft: 13860 corp: 7/131b lim: 50 exec/s: 0 rss: 73Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:22.217 [2024-11-28 16:32:19.661986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:60297969664 len:1 00:08:22.217 [2024-11-28 16:32:19.662019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.217 [2024-11-28 16:32:19.662114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:22.217 [2024-11-28 16:32:19.662137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.217 [2024-11-28 16:32:19.662254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:22.217 [2024-11-28 16:32:19.662281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.217 [2024-11-28 16:32:19.662398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:22.217 [2024-11-28 16:32:19.662423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.217 #33 NEW cov: 12417 ft: 13978 
corp: 8/180b lim: 50 exec/s: 0 rss: 73Mb L: 49/49 MS: 1 InsertByte- 00:08:22.217 [2024-11-28 16:32:19.732203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:60297969664 len:1 00:08:22.217 [2024-11-28 16:32:19.732236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.217 [2024-11-28 16:32:19.732304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:22.217 [2024-11-28 16:32:19.732328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.217 [2024-11-28 16:32:19.732438] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:22.217 [2024-11-28 16:32:19.732462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.217 [2024-11-28 16:32:19.732575] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:22.217 [2024-11-28 16:32:19.732607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.217 #34 NEW cov: 12417 ft: 13995 corp: 9/229b lim: 50 exec/s: 0 rss: 73Mb L: 49/49 MS: 1 ShuffleBytes- 00:08:22.217 [2024-11-28 16:32:19.802032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:168430080 len:1 00:08:22.217 [2024-11-28 16:32:19.802062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.217 [2024-11-28 16:32:19.802175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:22.217 [2024-11-28 16:32:19.802199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.217 #35 NEW cov: 12417 ft: 14282 corp: 10/251b lim: 50 exec/s: 0 rss: 73Mb L: 22/49 MS: 1 PersAutoDict- DE: "\012\000\000\000\000\000\000\000"- 00:08:22.217 [2024-11-28 16:32:19.852640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:60297969664 len:1 00:08:22.217 [2024-11-28 16:32:19.852672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.218 [2024-11-28 16:32:19.852751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:22.218 [2024-11-28 16:32:19.852779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.218 [2024-11-28 16:32:19.852896] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:22.218 [2024-11-28 16:32:19.852920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.218 [2024-11-28 16:32:19.853036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:22.218 [2024-11-28 16:32:19.853061] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.477 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:22.477 #36 NEW cov: 12440 ft: 14357 corp: 11/300b lim: 50 exec/s: 0 rss: 73Mb L: 49/49 MS: 1 CopyPart- 00:08:22.477 [2024-11-28 16:32:19.902264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071713062911 len:65536 00:08:22.477 [2024-11-28 16:32:19.902291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.477 #42 NEW cov: 12440 ft: 14433 corp: 12/318b lim: 50 exec/s: 0 rss: 73Mb L: 18/49 MS: 1 ShuffleBytes- 00:08:22.477 [2024-11-28 16:32:19.952358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071713046527 len:65536 00:08:22.477 [2024-11-28 16:32:19.952386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.477 #43 NEW cov: 12440 ft: 14581 corp: 13/336b lim: 50 exec/s: 43 rss: 73Mb L: 18/49 MS: 1 ChangeBit- 00:08:22.477 [2024-11-28 16:32:20.002628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446735275620040703 len:65280 00:08:22.477 [2024-11-28 16:32:20.002659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.477 #44 NEW cov: 12440 ft: 14612 corp: 14/354b lim: 50 exec/s: 44 rss: 73Mb L: 18/49 MS: 1 ChangeBit- 00:08:22.477 [2024-11-28 16:32:20.052909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071478181887 len:65536 00:08:22.477 [2024-11-28 16:32:20.052942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.477 [2024-11-28 16:32:20.053058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:22.477 [2024-11-28 16:32:20.053082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.477 #46 NEW cov: 12440 ft: 14659 corp: 15/374b lim: 50 exec/s: 46 rss: 73Mb L: 20/49 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:22.477 [2024-11-28 16:32:20.102966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071478181887 len:65536 00:08:22.477 [2024-11-28 16:32:20.103000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.737 #47 NEW cov: 12440 ft: 14793 corp: 16/391b lim: 50 exec/s: 47 rss: 73Mb L: 17/49 MS: 1 EraseBytes- 00:08:22.737 [2024-11-28 16:32:20.173158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:96757191671808 len:3585 00:08:22.737 [2024-11-28 16:32:20.173187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.737 #48 NEW cov: 12440 ft: 14818 corp: 17/406b lim: 50 exec/s: 48 rss: 73Mb L: 15/49 MS: 1 InsertByte- 00:08:22.737 [2024-11-28 16:32:20.223378] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2666130977406844927 len:65536 00:08:22.737 [2024-11-28 16:32:20.223407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.737 #49 NEW cov: 12440 ft: 14836 corp: 18/424b lim: 50 exec/s: 49 rss: 73Mb L: 18/49 MS: 1 ChangeByte- 00:08:22.737 [2024-11-28 16:32:20.273444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4503597630816512 len:65536 00:08:22.737 [2024-11-28 16:32:20.273472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.737 #50 NEW cov: 12440 ft: 14883 corp: 19/442b lim: 50 exec/s: 50 rss: 73Mb L: 18/49 MS: 1 CMP- DE: "\001\000\000\017"- 00:08:22.737 [2024-11-28 16:32:20.343662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:16140901062499368959 len:65536 00:08:22.737 [2024-11-28 16:32:20.343693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.996 #51 NEW cov: 12440 ft: 14913 corp: 20/460b lim: 50 exec/s: 51 rss: 73Mb L: 18/49 MS: 1 ChangeBinInt- 00:08:22.996 [2024-11-28 16:32:20.413860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:22.996 [2024-11-28 16:32:20.413894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.996 #52 NEW cov: 12440 ft: 14983 corp: 21/470b lim: 50 exec/s: 52 rss: 73Mb L: 10/49 MS: 1 EraseBytes- 00:08:22.996 [2024-11-28 16:32:20.484256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071713062911 len:257 00:08:22.997 [2024-11-28 16:32:20.484296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.997 [2024-11-28 16:32:20.484414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744069415632895 len:65536 00:08:22.997 [2024-11-28 16:32:20.484437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.997 #53 NEW cov: 12440 ft: 15005 corp: 22/492b lim: 50 exec/s: 53 rss: 73Mb L: 22/49 MS: 1 PersAutoDict- DE: "\001\000\000\017"- 00:08:22.997 [2024-11-28 16:32:20.534242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071713062911 len:65536 00:08:22.997 [2024-11-28 16:32:20.534270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.997 #54 NEW cov: 12440 ft: 15089 corp: 23/510b lim: 50 exec/s: 54 rss: 73Mb L: 18/49 MS: 1 CopyPart- 00:08:22.997 [2024-11-28 16:32:20.584439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2666130977406844927 len:65536 00:08:22.997 [2024-11-28 16:32:20.584472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.997 #55 NEW cov: 12440 ft: 15105 corp: 24/528b lim: 50 exec/s: 55 rss: 73Mb L: 18/49 
MS: 1 ShuffleBytes- 00:08:22.997 [2024-11-28 16:32:20.624655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:168430081 len:1 00:08:22.997 [2024-11-28 16:32:20.624687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.997 [2024-11-28 16:32:20.624763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:22.997 [2024-11-28 16:32:20.624787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.255 #56 NEW cov: 12440 ft: 15162 corp: 25/550b lim: 50 exec/s: 56 rss: 74Mb L: 22/49 MS: 1 ChangeBinInt- 00:08:23.255 [2024-11-28 16:32:20.695303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071713062911 len:65536 00:08:23.256 [2024-11-28 16:32:20.695332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.256 [2024-11-28 16:32:20.695418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:23.256 [2024-11-28 16:32:20.695442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.256 [2024-11-28 16:32:20.695558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:23.256 [2024-11-28 16:32:20.695582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.256 [2024-11-28 16:32:20.695688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:23.256 [2024-11-28 16:32:20.695713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.256 #62 NEW cov: 12440 ft: 15171 corp: 26/591b lim: 50 exec/s: 62 rss: 74Mb L: 41/49 MS: 1 InsertRepeatedBytes- 00:08:23.256 [2024-11-28 16:32:20.764937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2666130977406844927 len:65536 00:08:23.256 [2024-11-28 16:32:20.764966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.256 #63 NEW cov: 12440 ft: 15181 corp: 27/610b lim: 50 exec/s: 63 rss: 74Mb L: 19/49 MS: 1 InsertByte- 00:08:23.256 [2024-11-28 16:32:20.835208] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744071478181887 len:65536 00:08:23.256 [2024-11-28 16:32:20.835241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.256 #64 NEW cov: 12440 ft: 15190 corp: 28/627b lim: 50 exec/s: 64 rss: 74Mb L: 17/49 MS: 1 CopyPart- 00:08:23.256 [2024-11-28 16:32:20.885323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:16140901062499368959 len:65536 00:08:23.256 [2024-11-28 16:32:20.885355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.514 #65 NEW cov: 12440 ft: 15244 corp: 29/645b lim: 50 exec/s: 65 rss: 74Mb L: 18/49 MS: 1 CopyPart- 00:08:23.514 [2024-11-28 16:32:20.955442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:201326592 len:65536 00:08:23.514 [2024-11-28 16:32:20.955474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.514 #66 NEW cov: 12440 ft: 15284 corp: 30/663b lim: 50 exec/s: 33 rss: 74Mb L: 18/49 MS: 1 CMP- DE: "\014\000\000\000\000\000\000\000"- 00:08:23.514 #66 DONE cov: 12440 ft: 15284 corp: 30/663b lim: 50 exec/s: 33 rss: 74Mb 00:08:23.514 ###### Recommended dictionary. ###### 00:08:23.514 "\012\000\000\000\000\000\000\000" # Uses: 1 00:08:23.514 "\001\000\000\017" # Uses: 1 00:08:23.514 "\014\000\000\000\000\000\000\000" # Uses: 0 00:08:23.514 ###### End of recommended dictionary. ###### 00:08:23.514 Done 66 runs in 2 second(s) 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:23.514 16:32:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:23.514 [2024-11-28 16:32:21.123911] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:23.514 [2024-11-28 16:32:21.123982] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3763252 ] 00:08:23.773 [2024-11-28 16:32:21.299855] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.773 [2024-11-28 16:32:21.321478] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.773 [2024-11-28 16:32:21.373551] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.773 [2024-11-28 16:32:21.389862] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:23.773 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.773 INFO: Seed: 1400184008 00:08:24.032 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:24.032 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:24.032 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:24.032 INFO: A corpus is not provided, starting from an empty corpus 00:08:24.032 #2 INITED exec/s: 0 rss: 65Mb 00:08:24.032 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:24.032 This may also happen if the target rejected all inputs we tried so far 00:08:24.032 [2024-11-28 16:32:21.458422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.032 [2024-11-28 16:32:21.458458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.032 [2024-11-28 16:32:21.458596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.032 [2024-11-28 16:32:21.458623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.032 [2024-11-28 16:32:21.458741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.032 [2024-11-28 16:32:21.458767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.290 NEW_FUNC[1/716]: 0x474ab8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:24.290 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:24.290 #11 NEW cov: 12271 ft: 12271 corp: 2/70b lim: 90 exec/s: 0 rss: 73Mb L: 69/69 MS: 4 ChangeBit-ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:24.290 [2024-11-28 16:32:21.799220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.290 [2024-11-28 16:32:21.799260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.290 [2024-11-28 16:32:21.799380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE 
(11) sqid:1 cid:1 nsid:0 00:08:24.290 [2024-11-28 16:32:21.799403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.290 [2024-11-28 16:32:21.799522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.290 [2024-11-28 16:32:21.799545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.290 #12 NEW cov: 12384 ft: 12726 corp: 3/139b lim: 90 exec/s: 0 rss: 73Mb L: 69/69 MS: 1 ChangeBit- 00:08:24.290 [2024-11-28 16:32:21.869405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.290 [2024-11-28 16:32:21.869438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.290 [2024-11-28 16:32:21.869539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.290 [2024-11-28 16:32:21.869563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.290 [2024-11-28 16:32:21.869706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.290 [2024-11-28 16:32:21.869729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.290 #13 NEW cov: 12390 ft: 12916 corp: 4/208b lim: 90 exec/s: 0 rss: 73Mb L: 69/69 MS: 1 ChangeByte- 00:08:24.549 [2024-11-28 16:32:21.939881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.549 [2024-11-28 16:32:21.939918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:21.940010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.549 [2024-11-28 16:32:21.940034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:21.940153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.549 [2024-11-28 16:32:21.940179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:21.940313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.549 [2024-11-28 16:32:21.940336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.549 #14 NEW cov: 12475 ft: 13771 corp: 5/283b lim: 90 exec/s: 0 rss: 73Mb L: 75/75 MS: 1 InsertRepeatedBytes- 00:08:24.549 [2024-11-28 16:32:22.010062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.549 [2024-11-28 16:32:22.010095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.010220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) 
sqid:1 cid:1 nsid:0 00:08:24.549 [2024-11-28 16:32:22.010248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.010371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.549 [2024-11-28 16:32:22.010397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.010525] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.549 [2024-11-28 16:32:22.010546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.549 #15 NEW cov: 12475 ft: 13877 corp: 6/362b lim: 90 exec/s: 0 rss: 73Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:08:24.549 [2024-11-28 16:32:22.079960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.549 [2024-11-28 16:32:22.079992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.080138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.549 [2024-11-28 16:32:22.080159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.080285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.549 [2024-11-28 16:32:22.080305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.549 #16 NEW cov: 12475 ft: 13946 corp: 7/432b lim: 90 exec/s: 0 rss: 73Mb L: 70/79 MS: 1 InsertByte- 00:08:24.549 [2024-11-28 16:32:22.130072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.549 [2024-11-28 16:32:22.130108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.130249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.549 [2024-11-28 16:32:22.130267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.130385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.549 [2024-11-28 16:32:22.130410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.549 #17 NEW cov: 12475 ft: 14002 corp: 8/501b lim: 90 exec/s: 0 rss: 73Mb L: 69/79 MS: 1 CrossOver- 00:08:24.549 [2024-11-28 16:32:22.180573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.549 [2024-11-28 16:32:22.180604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.180704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 
cid:1 nsid:0 00:08:24.549 [2024-11-28 16:32:22.180729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.180844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.549 [2024-11-28 16:32:22.180867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.549 [2024-11-28 16:32:22.180985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.549 [2024-11-28 16:32:22.181008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.808 #18 NEW cov: 12475 ft: 14043 corp: 9/573b lim: 90 exec/s: 0 rss: 73Mb L: 72/79 MS: 1 InsertRepeatedBytes- 00:08:24.808 [2024-11-28 16:32:22.230715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.808 [2024-11-28 16:32:22.230749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.230854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.808 [2024-11-28 16:32:22.230877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.230998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.808 [2024-11-28 16:32:22.231021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.231143] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.808 [2024-11-28 16:32:22.231162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.808 #24 NEW cov: 12475 ft: 14063 corp: 10/648b lim: 90 exec/s: 0 rss: 73Mb L: 75/79 MS: 1 ShuffleBytes- 00:08:24.808 [2024-11-28 16:32:22.280846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.808 [2024-11-28 16:32:22.280878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.280983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.808 [2024-11-28 16:32:22.281005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.281122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.808 [2024-11-28 16:32:22.281145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.281264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.808 [2024-11-28 16:32:22.281282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.808 #25 NEW cov: 12475 ft: 14105 corp: 11/720b lim: 90 exec/s: 0 rss: 74Mb L: 72/79 MS: 1 ShuffleBytes- 00:08:24.808 [2024-11-28 16:32:22.351146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.808 [2024-11-28 16:32:22.351176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.351246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.808 [2024-11-28 16:32:22.351269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.351386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.808 [2024-11-28 16:32:22.351407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.351521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:24.808 [2024-11-28 16:32:22.351547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.808 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:24.808 #26 NEW cov: 12498 ft: 14172 corp: 12/800b lim: 90 exec/s: 0 rss: 74Mb L: 80/80 MS: 1 InsertByte- 00:08:24.808 [2024-11-28 16:32:22.421038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:24.808 [2024-11-28 16:32:22.421070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.421191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:24.808 [2024-11-28 16:32:22.421218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.808 [2024-11-28 16:32:22.421345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:24.808 [2024-11-28 16:32:22.421364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.067 #27 NEW cov: 12498 ft: 14212 corp: 13/869b lim: 90 exec/s: 27 rss: 74Mb L: 69/80 MS: 1 ChangeBinInt- 00:08:25.067 [2024-11-28 16:32:22.491481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.067 [2024-11-28 16:32:22.491511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.491602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.067 [2024-11-28 16:32:22.491626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.491762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE 
(11) sqid:1 cid:2 nsid:0 00:08:25.067 [2024-11-28 16:32:22.491782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.491909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.067 [2024-11-28 16:32:22.491932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.067 #28 NEW cov: 12498 ft: 14244 corp: 14/945b lim: 90 exec/s: 28 rss: 74Mb L: 76/80 MS: 1 InsertRepeatedBytes- 00:08:25.067 [2024-11-28 16:32:22.541389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.067 [2024-11-28 16:32:22.541424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.541535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.067 [2024-11-28 16:32:22.541560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.541686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.067 [2024-11-28 16:32:22.541709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.067 #29 NEW cov: 12498 ft: 14290 corp: 15/1006b lim: 90 exec/s: 29 rss: 74Mb L: 61/80 MS: 1 EraseBytes- 00:08:25.067 [2024-11-28 16:32:22.591462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.067 [2024-11-28 16:32:22.591495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.591602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.067 [2024-11-28 16:32:22.591625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.591750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.067 [2024-11-28 16:32:22.591771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.067 #30 NEW cov: 12498 ft: 14332 corp: 16/1077b lim: 90 exec/s: 30 rss: 74Mb L: 71/80 MS: 1 InsertRepeatedBytes- 00:08:25.067 [2024-11-28 16:32:22.641934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.067 [2024-11-28 16:32:22.641965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.642039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.067 [2024-11-28 16:32:22.642062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.642193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.067 [2024-11-28 16:32:22.642220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.067 [2024-11-28 16:32:22.642347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.067 [2024-11-28 16:32:22.642370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.067 #31 NEW cov: 12498 ft: 14348 corp: 17/1149b lim: 90 exec/s: 31 rss: 74Mb L: 72/80 MS: 1 CopyPart- 00:08:25.067 [2024-11-28 16:32:22.712278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.067 [2024-11-28 16:32:22.712307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.068 [2024-11-28 16:32:22.712384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.068 [2024-11-28 16:32:22.712407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.068 [2024-11-28 16:32:22.712536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.068 [2024-11-28 16:32:22.712559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.068 [2024-11-28 16:32:22.712678] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.068 [2024-11-28 16:32:22.712700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.387 #32 NEW cov: 12498 ft: 14390 corp: 18/1225b lim: 90 exec/s: 32 rss: 74Mb L: 76/80 MS: 1 InsertByte- 00:08:25.387 [2024-11-28 16:32:22.782412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.387 [2024-11-28 16:32:22.782441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.782535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.387 [2024-11-28 16:32:22.782560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.782675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.387 [2024-11-28 16:32:22.782697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.782816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.387 [2024-11-28 16:32:22.782841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.387 #33 NEW cov: 12498 ft: 14405 corp: 19/1297b lim: 90 exec/s: 33 rss: 74Mb L: 72/80 MS: 1 ChangeBinInt- 00:08:25.387 [2024-11-28 16:32:22.852521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.387 [2024-11-28 16:32:22.852552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.852687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.387 [2024-11-28 16:32:22.852709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.852826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.387 [2024-11-28 16:32:22.852848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.387 #34 NEW cov: 12498 ft: 14425 corp: 20/1366b lim: 90 exec/s: 34 rss: 74Mb L: 69/80 MS: 1 ChangeBinInt- 00:08:25.387 [2024-11-28 16:32:22.902752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.387 [2024-11-28 16:32:22.902779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.902888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.387 [2024-11-28 16:32:22.902911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.903044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.387 [2024-11-28 16:32:22.903069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.903203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.387 [2024-11-28 16:32:22.903225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.387 #35 NEW cov: 12498 ft: 14468 corp: 21/1449b lim: 90 exec/s: 35 rss: 74Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:08:25.387 [2024-11-28 16:32:22.952736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.387 [2024-11-28 16:32:22.952769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.952877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.387 [2024-11-28 16:32:22.952902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:22.953026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.387 [2024-11-28 16:32:22.953048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.387 #36 NEW cov: 12498 ft: 14490 corp: 22/1518b lim: 90 exec/s: 36 rss: 74Mb L: 69/83 MS: 1 ChangeByte- 00:08:25.387 [2024-11-28 16:32:23.023267] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.387 [2024-11-28 16:32:23.023303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:23.023422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.387 [2024-11-28 16:32:23.023447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:23.023572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.387 [2024-11-28 16:32:23.023602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.387 [2024-11-28 16:32:23.023722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.387 [2024-11-28 16:32:23.023744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.682 #37 NEW cov: 12498 ft: 14524 corp: 23/1590b lim: 90 exec/s: 37 rss: 74Mb L: 72/83 MS: 1 ShuffleBytes- 00:08:25.682 [2024-11-28 16:32:23.073174] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.682 [2024-11-28 16:32:23.073211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.073310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.682 [2024-11-28 16:32:23.073336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.073460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.682 [2024-11-28 16:32:23.073486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.682 #38 NEW cov: 12498 ft: 14536 corp: 24/1661b lim: 90 exec/s: 38 rss: 74Mb L: 71/83 MS: 1 ChangeBit- 00:08:25.682 [2024-11-28 16:32:23.143414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.682 [2024-11-28 16:32:23.143449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.143547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.682 [2024-11-28 16:32:23.143572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.143691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.682 [2024-11-28 16:32:23.143713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.682 #39 NEW cov: 12498 ft: 14570 corp: 25/1731b lim: 90 exec/s: 39 rss: 75Mb L: 70/83 MS: 1 InsertByte- 00:08:25.682 [2024-11-28 16:32:23.213773] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.682 [2024-11-28 16:32:23.213806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.213901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.682 [2024-11-28 16:32:23.213926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.214052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.682 [2024-11-28 16:32:23.214078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.214200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.682 [2024-11-28 16:32:23.214226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.682 #40 NEW cov: 12498 ft: 14593 corp: 26/1811b lim: 90 exec/s: 40 rss: 75Mb L: 80/83 MS: 1 ChangeByte- 00:08:25.682 [2024-11-28 16:32:23.283941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.682 [2024-11-28 16:32:23.283970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.284040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.682 [2024-11-28 16:32:23.284064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.284194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.682 [2024-11-28 16:32:23.284219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.682 [2024-11-28 16:32:23.284344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.682 [2024-11-28 16:32:23.284369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.682 #41 NEW cov: 12498 ft: 14598 corp: 27/1885b lim: 90 exec/s: 41 rss: 75Mb L: 74/83 MS: 1 EraseBytes- 00:08:25.941 [2024-11-28 16:32:23.333897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.941 [2024-11-28 16:32:23.333930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.941 [2024-11-28 16:32:23.334048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.941 [2024-11-28 16:32:23.334071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.941 [2024-11-28 16:32:23.334188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.941 [2024-11-28 16:32:23.334210] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.941 #42 NEW cov: 12498 ft: 14611 corp: 28/1954b lim: 90 exec/s: 42 rss: 75Mb L: 69/83 MS: 1 EraseBytes- 00:08:25.941 [2024-11-28 16:32:23.404206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:25.941 [2024-11-28 16:32:23.404239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.941 [2024-11-28 16:32:23.404369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:25.941 [2024-11-28 16:32:23.404388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.941 [2024-11-28 16:32:23.404512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:25.941 [2024-11-28 16:32:23.404536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.941 [2024-11-28 16:32:23.404654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:25.941 [2024-11-28 16:32:23.404677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.941 #43 NEW cov: 12498 ft: 14650 corp: 29/2030b lim: 90 exec/s: 21 rss: 75Mb L: 76/83 MS: 1 CopyPart- 00:08:25.941 #43 DONE cov: 12498 ft: 14650 corp: 29/2030b lim: 90 exec/s: 21 rss: 75Mb 00:08:25.941 Done 43 runs in 2 second(s) 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": 
"4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:25.941 16:32:23 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:25.941 [2024-11-28 16:32:23.580545] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:25.941 [2024-11-28 16:32:23.580621] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3763782 ] 00:08:26.200 [2024-11-28 16:32:23.756057] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.200 [2024-11-28 16:32:23.778587] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.200 [2024-11-28 16:32:23.830983] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.200 [2024-11-28 16:32:23.847308] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:26.459 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.459 INFO: Seed: 3861162186 00:08:26.459 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:26.459 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:26.459 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:26.459 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.459 #2 INITED exec/s: 0 rss: 64Mb 00:08:26.459 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:26.459 This may also happen if the target rejected all inputs we tried so far 00:08:26.459 [2024-11-28 16:32:23.892888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.459 [2024-11-28 16:32:23.892919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.459 [2024-11-28 16:32:23.892957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.459 [2024-11-28 16:32:23.892974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.459 [2024-11-28 16:32:23.893031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.459 [2024-11-28 16:32:23.893049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.718 NEW_FUNC[1/716]: 0x477ce8 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:26.718 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.718 #8 NEW cov: 12246 ft: 12244 corp: 2/39b lim: 50 exec/s: 0 rss: 71Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:26.718 [2024-11-28 16:32:24.203897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.718 [2024-11-28 16:32:24.203937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.718 [2024-11-28 16:32:24.204000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.718 [2024-11-28 16:32:24.204018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.718 [2024-11-28 16:32:24.204079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.718 [2024-11-28 16:32:24.204096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.718 [2024-11-28 16:32:24.204160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.718 [2024-11-28 16:32:24.204179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.718 #16 NEW cov: 12359 ft: 13117 corp: 3/86b lim: 50 exec/s: 0 rss: 72Mb L: 47/47 MS: 3 ChangeBinInt-ChangeByte-InsertRepeatedBytes- 00:08:26.718 [2024-11-28 16:32:24.243414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.718 [2024-11-28 16:32:24.243444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.718 #21 NEW cov: 12365 ft: 14086 corp: 4/96b lim: 50 exec/s: 0 rss: 72Mb L: 10/47 MS: 5 InsertByte-InsertByte-EraseBytes-ChangeBit-CMP- DE: "\001\000\000\000\377\377\377\377"- 00:08:26.718 [2024-11-28 16:32:24.283844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE 
(15) sqid:1 cid:0 nsid:0 00:08:26.718 [2024-11-28 16:32:24.283873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.718 [2024-11-28 16:32:24.283922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.718 [2024-11-28 16:32:24.283936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.718 [2024-11-28 16:32:24.283992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.718 [2024-11-28 16:32:24.284008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.718 #22 NEW cov: 12450 ft: 14375 corp: 5/134b lim: 50 exec/s: 0 rss: 72Mb L: 38/47 MS: 1 ChangeByte- 00:08:26.718 [2024-11-28 16:32:24.343709] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.718 [2024-11-28 16:32:24.343738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 #23 NEW cov: 12450 ft: 14448 corp: 6/144b lim: 50 exec/s: 0 rss: 72Mb L: 10/47 MS: 1 PersAutoDict- DE: "\001\000\000\000\377\377\377\377"- 00:08:26.977 [2024-11-28 16:32:24.404001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.977 [2024-11-28 16:32:24.404028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 [2024-11-28 16:32:24.404068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.977 [2024-11-28 16:32:24.404084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.977 #24 NEW cov: 12450 ft: 14748 corp: 7/167b lim: 50 exec/s: 0 rss: 72Mb L: 23/47 MS: 1 InsertRepeatedBytes- 00:08:26.977 [2024-11-28 16:32:24.464321] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.977 [2024-11-28 16:32:24.464348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 [2024-11-28 16:32:24.464386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.977 [2024-11-28 16:32:24.464401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.977 [2024-11-28 16:32:24.464457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.977 [2024-11-28 16:32:24.464474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.977 #25 NEW cov: 12450 ft: 14790 corp: 8/200b lim: 50 exec/s: 0 rss: 72Mb L: 33/47 MS: 1 CrossOver- 00:08:26.977 [2024-11-28 16:32:24.504439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.977 [2024-11-28 16:32:24.504469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 [2024-11-28 16:32:24.504515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.977 [2024-11-28 16:32:24.504530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.977 [2024-11-28 16:32:24.504586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.977 [2024-11-28 16:32:24.504606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.977 #26 NEW cov: 12450 ft: 14859 corp: 9/236b lim: 50 exec/s: 0 rss: 72Mb L: 36/47 MS: 1 InsertRepeatedBytes- 00:08:26.977 [2024-11-28 16:32:24.544665] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.977 [2024-11-28 16:32:24.544693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.977 [2024-11-28 16:32:24.544739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.977 [2024-11-28 16:32:24.544756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.977 [2024-11-28 16:32:24.544811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.977 [2024-11-28 16:32:24.544828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.978 [2024-11-28 16:32:24.544886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.978 [2024-11-28 16:32:24.544902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.978 #27 NEW cov: 12450 ft: 14997 corp: 10/282b lim: 50 exec/s: 0 rss: 72Mb L: 46/47 MS: 1 PersAutoDict- DE: "\001\000\000\000\377\377\377\377"- 00:08:26.978 [2024-11-28 16:32:24.604869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:26.978 [2024-11-28 16:32:24.604897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.978 [2024-11-28 16:32:24.604945] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:26.978 [2024-11-28 16:32:24.604961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.978 [2024-11-28 16:32:24.605017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:26.978 [2024-11-28 16:32:24.605034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.978 [2024-11-28 16:32:24.605089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:26.978 [2024-11-28 16:32:24.605104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.236 #28 NEW cov: 12450 ft: 15047 
corp: 11/330b lim: 50 exec/s: 0 rss: 72Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:27.236 [2024-11-28 16:32:24.644821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.236 [2024-11-28 16:32:24.644851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.236 [2024-11-28 16:32:24.644890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.236 [2024-11-28 16:32:24.644908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.236 [2024-11-28 16:32:24.644966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.236 [2024-11-28 16:32:24.644984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.236 #30 NEW cov: 12450 ft: 15093 corp: 12/369b lim: 50 exec/s: 0 rss: 72Mb L: 39/48 MS: 2 ShuffleBytes-CrossOver- 00:08:27.236 [2024-11-28 16:32:24.684955] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.236 [2024-11-28 16:32:24.684984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.236 [2024-11-28 16:32:24.685023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.236 [2024-11-28 16:32:24.685039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.236 [2024-11-28 16:32:24.685098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.236 [2024-11-28 16:32:24.685119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.236 #31 NEW cov: 12450 ft: 15121 corp: 13/407b lim: 50 exec/s: 0 rss: 72Mb L: 38/48 MS: 1 CrossOver- 00:08:27.236 [2024-11-28 16:32:24.725202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.236 [2024-11-28 16:32:24.725231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.236 [2024-11-28 16:32:24.725274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.236 [2024-11-28 16:32:24.725291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.237 [2024-11-28 16:32:24.725347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.237 [2024-11-28 16:32:24.725363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.237 [2024-11-28 16:32:24.725420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.237 [2024-11-28 16:32:24.725436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.237 #32 NEW cov: 12450 
ft: 15127 corp: 14/453b lim: 50 exec/s: 0 rss: 72Mb L: 46/48 MS: 1 ChangeBit- 00:08:27.237 [2024-11-28 16:32:24.785509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.237 [2024-11-28 16:32:24.785537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.237 [2024-11-28 16:32:24.785586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.237 [2024-11-28 16:32:24.785605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.237 [2024-11-28 16:32:24.785662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.237 [2024-11-28 16:32:24.785679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.237 [2024-11-28 16:32:24.785735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.237 [2024-11-28 16:32:24.785749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.237 [2024-11-28 16:32:24.785806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:27.237 [2024-11-28 16:32:24.785822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.237 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:27.237 #33 NEW cov: 12473 ft: 15245 corp: 15/503b lim: 50 exec/s: 0 rss: 72Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:27.237 [2024-11-28 16:32:24.845380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.237 [2024-11-28 16:32:24.845409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.237 [2024-11-28 16:32:24.845459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.237 [2024-11-28 16:32:24.845476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.237 [2024-11-28 16:32:24.845534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.237 [2024-11-28 16:32:24.845551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.495 #34 NEW cov: 12473 ft: 15296 corp: 16/541b lim: 50 exec/s: 34 rss: 73Mb L: 38/50 MS: 1 ShuffleBytes- 00:08:27.495 [2024-11-28 16:32:24.905707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.495 [2024-11-28 16:32:24.905736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.495 [2024-11-28 16:32:24.905784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.495 [2024-11-28 16:32:24.905800] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.495 [2024-11-28 16:32:24.905858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.495 [2024-11-28 16:32:24.905875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.495 [2024-11-28 16:32:24.905933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.495 [2024-11-28 16:32:24.905951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.495 #35 NEW cov: 12473 ft: 15334 corp: 17/584b lim: 50 exec/s: 35 rss: 73Mb L: 43/50 MS: 1 EraseBytes- 00:08:27.495 [2024-11-28 16:32:24.965703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.495 [2024-11-28 16:32:24.965731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.496 [2024-11-28 16:32:24.965771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.496 [2024-11-28 16:32:24.965788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.496 [2024-11-28 16:32:24.965845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.496 [2024-11-28 16:32:24.965860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.496 #36 NEW cov: 12473 ft: 15343 corp: 18/622b lim: 50 exec/s: 36 rss: 73Mb L: 38/50 MS: 1 ShuffleBytes- 00:08:27.496 [2024-11-28 16:32:25.005979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.496 [2024-11-28 16:32:25.006009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.496 [2024-11-28 16:32:25.006058] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.496 [2024-11-28 16:32:25.006075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.496 [2024-11-28 16:32:25.006132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.496 [2024-11-28 16:32:25.006148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.496 [2024-11-28 16:32:25.006207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.496 [2024-11-28 16:32:25.006223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.496 #37 NEW cov: 12473 ft: 15359 corp: 19/669b lim: 50 exec/s: 37 rss: 73Mb L: 47/50 MS: 1 InsertRepeatedBytes- 00:08:27.496 [2024-11-28 16:32:25.065627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.496 [2024-11-28 16:32:25.065656] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.496 #38 NEW cov: 12473 ft: 15430 corp: 20/679b lim: 50 exec/s: 38 rss: 73Mb L: 10/50 MS: 1 ChangeBinInt- 00:08:27.496 [2024-11-28 16:32:25.106207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.496 [2024-11-28 16:32:25.106236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.496 [2024-11-28 16:32:25.106285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.496 [2024-11-28 16:32:25.106302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.496 [2024-11-28 16:32:25.106357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.496 [2024-11-28 16:32:25.106373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.496 [2024-11-28 16:32:25.106429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.496 [2024-11-28 16:32:25.106446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.754 #39 NEW cov: 12473 ft: 15457 corp: 21/726b lim: 50 exec/s: 39 rss: 73Mb L: 47/50 MS: 1 CopyPart- 00:08:27.754 [2024-11-28 16:32:25.166574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.754 [2024-11-28 16:32:25.166604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.754 [2024-11-28 16:32:25.166659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.754 [2024-11-28 16:32:25.166676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.754 [2024-11-28 16:32:25.166734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.754 [2024-11-28 16:32:25.166751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.754 [2024-11-28 16:32:25.166807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.754 [2024-11-28 16:32:25.166823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.754 [2024-11-28 16:32:25.166883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:27.754 [2024-11-28 16:32:25.166900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:27.754 #40 NEW cov: 12473 ft: 15470 corp: 22/776b lim: 50 exec/s: 40 rss: 73Mb L: 50/50 MS: 1 CrossOver- 00:08:27.754 [2024-11-28 16:32:25.226615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.754 [2024-11-28 16:32:25.226643] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.754 [2024-11-28 16:32:25.226694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.754 [2024-11-28 16:32:25.226709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.754 [2024-11-28 16:32:25.226764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.754 [2024-11-28 16:32:25.226782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.755 [2024-11-28 16:32:25.226838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.755 [2024-11-28 16:32:25.226852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.755 #46 NEW cov: 12473 ft: 15487 corp: 23/820b lim: 50 exec/s: 46 rss: 73Mb L: 44/50 MS: 1 InsertRepeatedBytes- 00:08:27.755 [2024-11-28 16:32:25.266733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.755 [2024-11-28 16:32:25.266761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.755 [2024-11-28 16:32:25.266814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.755 [2024-11-28 16:32:25.266832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.755 [2024-11-28 16:32:25.266888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:27.755 [2024-11-28 16:32:25.266903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.755 [2024-11-28 16:32:25.266959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:27.755 [2024-11-28 16:32:25.266974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.755 #47 NEW cov: 12473 ft: 15504 corp: 24/868b lim: 50 exec/s: 47 rss: 73Mb L: 48/50 MS: 1 CrossOver- 00:08:27.755 [2024-11-28 16:32:25.326393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.755 [2024-11-28 16:32:25.326422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.755 #48 NEW cov: 12473 ft: 15554 corp: 25/885b lim: 50 exec/s: 48 rss: 73Mb L: 17/50 MS: 1 InsertRepeatedBytes- 00:08:27.755 [2024-11-28 16:32:25.366692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:27.755 [2024-11-28 16:32:25.366720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.755 [2024-11-28 16:32:25.366777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:27.755 [2024-11-28 
16:32:25.366793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.755 #49 NEW cov: 12473 ft: 15568 corp: 26/912b lim: 50 exec/s: 49 rss: 73Mb L: 27/50 MS: 1 CrossOver- 00:08:28.014 [2024-11-28 16:32:25.407169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.014 [2024-11-28 16:32:25.407197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.407241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.014 [2024-11-28 16:32:25.407256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.407313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.014 [2024-11-28 16:32:25.407330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.407389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.014 [2024-11-28 16:32:25.407406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.014 #50 NEW cov: 12473 ft: 15606 corp: 27/959b lim: 50 exec/s: 50 rss: 73Mb L: 47/50 MS: 1 CMP- DE: "\001\223EFiw\322\304"- 00:08:28.014 [2024-11-28 16:32:25.447071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.014 [2024-11-28 16:32:25.447098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.447137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.014 [2024-11-28 16:32:25.447156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.447215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.014 [2024-11-28 16:32:25.447231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.014 #51 NEW cov: 12473 ft: 15664 corp: 28/997b lim: 50 exec/s: 51 rss: 73Mb L: 38/50 MS: 1 ShuffleBytes- 00:08:28.014 [2024-11-28 16:32:25.507296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.014 [2024-11-28 16:32:25.507325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.507382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.014 [2024-11-28 16:32:25.507399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.507455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 
00:08:28.014 [2024-11-28 16:32:25.507470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.014 #52 NEW cov: 12473 ft: 15685 corp: 29/1035b lim: 50 exec/s: 52 rss: 73Mb L: 38/50 MS: 1 CopyPart- 00:08:28.014 [2024-11-28 16:32:25.547692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.014 [2024-11-28 16:32:25.547721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.547774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.014 [2024-11-28 16:32:25.547790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.547849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.014 [2024-11-28 16:32:25.547866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.547923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.014 [2024-11-28 16:32:25.547937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.547994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:28.014 [2024-11-28 16:32:25.548011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:28.014 #53 NEW cov: 12473 ft: 15748 corp: 30/1085b lim: 50 exec/s: 53 rss: 73Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:28.014 [2024-11-28 16:32:25.587674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.014 [2024-11-28 16:32:25.587702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.587752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.014 [2024-11-28 16:32:25.587768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.587825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.014 [2024-11-28 16:32:25.587841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.014 [2024-11-28 16:32:25.587902] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.014 [2024-11-28 16:32:25.587918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.014 #54 NEW cov: 12473 ft: 15768 corp: 31/1133b lim: 50 exec/s: 54 rss: 73Mb L: 48/50 MS: 1 InsertByte- 00:08:28.014 [2024-11-28 16:32:25.647338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 
nsid:0 00:08:28.014 [2024-11-28 16:32:25.647367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.274 #55 NEW cov: 12473 ft: 15807 corp: 32/1150b lim: 50 exec/s: 55 rss: 73Mb L: 17/50 MS: 1 ChangeByte- 00:08:28.274 [2024-11-28 16:32:25.707808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.274 [2024-11-28 16:32:25.707837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.274 [2024-11-28 16:32:25.707877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.274 [2024-11-28 16:32:25.707891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.274 [2024-11-28 16:32:25.707949] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.274 [2024-11-28 16:32:25.707966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.274 #56 NEW cov: 12473 ft: 15820 corp: 33/1183b lim: 50 exec/s: 56 rss: 74Mb L: 33/50 MS: 1 ShuffleBytes- 00:08:28.274 [2024-11-28 16:32:25.768164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.274 [2024-11-28 16:32:25.768191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.274 [2024-11-28 16:32:25.768237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.274 [2024-11-28 16:32:25.768254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.274 [2024-11-28 16:32:25.768311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.274 [2024-11-28 16:32:25.768327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.274 [2024-11-28 16:32:25.768385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.274 [2024-11-28 16:32:25.768401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.274 #57 NEW cov: 12473 ft: 15838 corp: 34/1231b lim: 50 exec/s: 57 rss: 74Mb L: 48/50 MS: 1 InsertRepeatedBytes- 00:08:28.274 [2024-11-28 16:32:25.807739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.274 [2024-11-28 16:32:25.807768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.274 #58 NEW cov: 12473 ft: 15901 corp: 35/1241b lim: 50 exec/s: 58 rss: 74Mb L: 10/50 MS: 1 CopyPart- 00:08:28.274 [2024-11-28 16:32:25.868427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:28.274 [2024-11-28 16:32:25.868455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.274 
[2024-11-28 16:32:25.868505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:28.274 [2024-11-28 16:32:25.868522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.274 [2024-11-28 16:32:25.868581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:28.274 [2024-11-28 16:32:25.868604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:28.274 [2024-11-28 16:32:25.868663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:28.274 [2024-11-28 16:32:25.868678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:28.274 #59 NEW cov: 12473 ft: 15905 corp: 36/1290b lim: 50 exec/s: 29 rss: 74Mb L: 49/50 MS: 1 InsertByte- 00:08:28.274 #59 DONE cov: 12473 ft: 15905 corp: 36/1290b lim: 50 exec/s: 29 rss: 74Mb 00:08:28.274 ###### Recommended dictionary. ###### 00:08:28.274 "\001\000\000\000\377\377\377\377" # Uses: 2 00:08:28.274 "\001\223EFiw\322\304" # Uses: 0 00:08:28.274 ###### End of recommended dictionary. ###### 00:08:28.274 Done 59 runs in 2 second(s) 00:08:28.533 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:28.533 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:28.533 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.533 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:28.533 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:28.533 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:28.533 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:28.534 16:32:26 llvm_fuzz.nvmf_llvm_fuzz 
-- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:28.534 [2024-11-28 16:32:26.062535] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:28.534 [2024-11-28 16:32:26.062625] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3764071 ] 00:08:28.793 [2024-11-28 16:32:26.245878] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.793 [2024-11-28 16:32:26.268056] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.793 [2024-11-28 16:32:26.320162] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:28.793 [2024-11-28 16:32:26.336527] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:28.793 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.793 INFO: Seed: 2055198216 00:08:28.793 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:28.793 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:28.793 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:28.793 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.793 #2 INITED exec/s: 0 rss: 64Mb 00:08:28.793 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
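Before each run, the harness also seeds an LSAN suppression file (run.sh@41 and @42) and points LSAN_OPTIONS at it (run.sh@32) so two known SPDK allocations are excluded from the leak report; run.sh@54 removes the file again after the run. A sketch of the equivalent wiring, assuming the echo output is redirected into the suppression file named at run.sh@28 (the redirects themselves are not visible in the trace):

    # Assumed wiring for the suppression steps traced above; the redirects
    # and the export are inferences, the values are copied from the trace.
    suppress_file=/var/tmp/suppress_nvmf_fuzz
    echo leak:spdk_nvmf_qpair_disconnect  > "$suppress_file"    # run.sh@41
    echo leak:nvmf_ctrlr_create          >> "$suppress_file"    # run.sh@42
    export LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
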
00:08:28.793 This may also happen if the target rejected all inputs we tried so far 00:08:28.793 [2024-11-28 16:32:26.392009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:28.793 [2024-11-28 16:32:26.392040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:28.793 [2024-11-28 16:32:26.392091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:28.793 [2024-11-28 16:32:26.392114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:28.793 [2024-11-28 16:32:26.392166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:28.793 [2024-11-28 16:32:26.392181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.362 NEW_FUNC[1/715]: 0x479fb8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:29.362 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:29.362 #12 NEW cov: 12271 ft: 12270 corp: 2/54b lim: 85 exec/s: 0 rss: 72Mb L: 53/53 MS: 5 InsertByte-ShuffleBytes-InsertByte-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:29.362 [2024-11-28 16:32:26.722722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.362 [2024-11-28 16:32:26.722756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.362 [2024-11-28 16:32:26.722811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.362 [2024-11-28 16:32:26.722826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.362 NEW_FUNC[1/1]: 0xf81358 in spdk_get_ticks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:321 00:08:29.362 #15 NEW cov: 12385 ft: 13092 corp: 3/97b lim: 85 exec/s: 0 rss: 72Mb L: 43/53 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:29.362 [2024-11-28 16:32:26.762747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.362 [2024-11-28 16:32:26.762775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.362 [2024-11-28 16:32:26.762827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.362 [2024-11-28 16:32:26.762843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.362 #16 NEW cov: 12391 ft: 13231 corp: 4/136b lim: 85 exec/s: 0 rss: 72Mb L: 39/53 MS: 1 EraseBytes- 00:08:29.362 [2024-11-28 16:32:26.822926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.362 [2024-11-28 16:32:26.822955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:29.362 [2024-11-28 16:32:26.823005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.362 [2024-11-28 16:32:26.823022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.362 #17 NEW cov: 12476 ft: 13424 corp: 5/180b lim: 85 exec/s: 0 rss: 72Mb L: 44/53 MS: 1 InsertByte- 00:08:29.362 [2024-11-28 16:32:26.883107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.362 [2024-11-28 16:32:26.883133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.362 [2024-11-28 16:32:26.883170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.362 [2024-11-28 16:32:26.883187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.362 #19 NEW cov: 12476 ft: 13703 corp: 6/218b lim: 85 exec/s: 0 rss: 72Mb L: 38/53 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:29.362 [2024-11-28 16:32:26.923195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.362 [2024-11-28 16:32:26.923222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.362 [2024-11-28 16:32:26.923272] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.362 [2024-11-28 16:32:26.923289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.362 #20 NEW cov: 12476 ft: 13752 corp: 7/262b lim: 85 exec/s: 0 rss: 72Mb L: 44/53 MS: 1 ChangeBit- 00:08:29.362 [2024-11-28 16:32:26.983511] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.362 [2024-11-28 16:32:26.983537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.362 [2024-11-28 16:32:26.983572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.362 [2024-11-28 16:32:26.983588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.362 [2024-11-28 16:32:26.983662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.362 [2024-11-28 16:32:26.983678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.362 #21 NEW cov: 12476 ft: 13783 corp: 8/315b lim: 85 exec/s: 0 rss: 72Mb L: 53/53 MS: 1 ChangeByte- 00:08:29.622 [2024-11-28 16:32:27.023500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.622 [2024-11-28 16:32:27.023527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.622 [2024-11-28 16:32:27.023574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.622 [2024-11-28 16:32:27.023590] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.622 #22 NEW cov: 12476 ft: 13842 corp: 9/351b lim: 85 exec/s: 0 rss: 72Mb L: 36/53 MS: 1 EraseBytes- 00:08:29.622 [2024-11-28 16:32:27.083847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.622 [2024-11-28 16:32:27.083873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.622 [2024-11-28 16:32:27.083920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.622 [2024-11-28 16:32:27.083935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.622 [2024-11-28 16:32:27.083991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.622 [2024-11-28 16:32:27.084006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.622 #23 NEW cov: 12476 ft: 13884 corp: 10/404b lim: 85 exec/s: 0 rss: 72Mb L: 53/53 MS: 1 ShuffleBytes- 00:08:29.622 [2024-11-28 16:32:27.124088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.622 [2024-11-28 16:32:27.124114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.622 [2024-11-28 16:32:27.124164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.622 [2024-11-28 16:32:27.124179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.622 [2024-11-28 16:32:27.124231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.622 [2024-11-28 16:32:27.124247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.622 [2024-11-28 16:32:27.124300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.622 [2024-11-28 16:32:27.124315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.622 #24 NEW cov: 12476 ft: 14264 corp: 11/487b lim: 85 exec/s: 0 rss: 72Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:08:29.622 [2024-11-28 16:32:27.183799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.622 [2024-11-28 16:32:27.183826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.622 #25 NEW cov: 12476 ft: 15127 corp: 12/504b lim: 85 exec/s: 0 rss: 72Mb L: 17/83 MS: 1 CrossOver- 00:08:29.622 [2024-11-28 16:32:27.224078] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.622 [2024-11-28 16:32:27.224105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.622 [2024-11-28 16:32:27.224141] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.622 [2024-11-28 16:32:27.224159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.622 #26 NEW cov: 12476 ft: 15229 corp: 13/543b lim: 85 exec/s: 0 rss: 72Mb L: 39/83 MS: 1 EraseBytes- 00:08:29.622 [2024-11-28 16:32:27.264214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.622 [2024-11-28 16:32:27.264243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.622 [2024-11-28 16:32:27.264296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.622 [2024-11-28 16:32:27.264312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.881 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:29.882 #27 NEW cov: 12499 ft: 15308 corp: 14/584b lim: 85 exec/s: 0 rss: 73Mb L: 41/83 MS: 1 CMP- DE: "\002\000"- 00:08:29.882 [2024-11-28 16:32:27.304334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.882 [2024-11-28 16:32:27.304359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.304395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.882 [2024-11-28 16:32:27.304410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.882 #28 NEW cov: 12499 ft: 15333 corp: 15/623b lim: 85 exec/s: 0 rss: 73Mb L: 39/83 MS: 1 InsertByte- 00:08:29.882 [2024-11-28 16:32:27.344720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.882 [2024-11-28 16:32:27.344747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.344794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.882 [2024-11-28 16:32:27.344809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.344864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.882 [2024-11-28 16:32:27.344880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.344936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.882 [2024-11-28 16:32:27.344952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.882 #29 NEW cov: 12499 ft: 15350 corp: 16/706b lim: 85 exec/s: 29 rss: 73Mb L: 83/83 MS: 1 ChangeBinInt- 00:08:29.882 [2024-11-28 16:32:27.404884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:0 nsid:0 00:08:29.882 [2024-11-28 16:32:27.404910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.404962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.882 [2024-11-28 16:32:27.404976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.405029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.882 [2024-11-28 16:32:27.405046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.405100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:29.882 [2024-11-28 16:32:27.405116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.882 #30 NEW cov: 12499 ft: 15362 corp: 17/789b lim: 85 exec/s: 30 rss: 73Mb L: 83/83 MS: 1 ChangeBinInt- 00:08:29.882 [2024-11-28 16:32:27.444873] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.882 [2024-11-28 16:32:27.444899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.444946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.882 [2024-11-28 16:32:27.444961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.445015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:29.882 [2024-11-28 16:32:27.445030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.882 #31 NEW cov: 12499 ft: 15410 corp: 18/843b lim: 85 exec/s: 31 rss: 73Mb L: 54/83 MS: 1 InsertByte- 00:08:29.882 [2024-11-28 16:32:27.504885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:29.882 [2024-11-28 16:32:27.504911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.882 [2024-11-28 16:32:27.504947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:29.882 [2024-11-28 16:32:27.504961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.882 #32 NEW cov: 12499 ft: 15434 corp: 19/882b lim: 85 exec/s: 32 rss: 73Mb L: 39/83 MS: 1 ChangeBit- 00:08:30.141 [2024-11-28 16:32:27.544940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.141 [2024-11-28 16:32:27.544967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.141 [2024-11-28 16:32:27.545019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER 
(0d) sqid:1 cid:1 nsid:0 00:08:30.141 [2024-11-28 16:32:27.545034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.141 #33 NEW cov: 12499 ft: 15450 corp: 20/930b lim: 85 exec/s: 33 rss: 73Mb L: 48/83 MS: 1 InsertRepeatedBytes- 00:08:30.141 [2024-11-28 16:32:27.605286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.141 [2024-11-28 16:32:27.605312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.141 [2024-11-28 16:32:27.605361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.141 [2024-11-28 16:32:27.605377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.141 [2024-11-28 16:32:27.605431] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.141 [2024-11-28 16:32:27.605446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.141 #34 NEW cov: 12499 ft: 15538 corp: 21/984b lim: 85 exec/s: 34 rss: 73Mb L: 54/83 MS: 1 InsertByte- 00:08:30.141 [2024-11-28 16:32:27.665291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.141 [2024-11-28 16:32:27.665317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.141 [2024-11-28 16:32:27.665352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.141 [2024-11-28 16:32:27.665368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.141 #35 NEW cov: 12499 ft: 15565 corp: 22/1032b lim: 85 exec/s: 35 rss: 73Mb L: 48/83 MS: 1 ChangeBinInt- 00:08:30.141 [2024-11-28 16:32:27.725479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.141 [2024-11-28 16:32:27.725507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.141 [2024-11-28 16:32:27.725560] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.141 [2024-11-28 16:32:27.725576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.141 #36 NEW cov: 12499 ft: 15578 corp: 23/1078b lim: 85 exec/s: 36 rss: 73Mb L: 46/83 MS: 1 PersAutoDict- DE: "\002\000"- 00:08:30.141 [2024-11-28 16:32:27.785830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.141 [2024-11-28 16:32:27.785859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.141 [2024-11-28 16:32:27.785899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.141 [2024-11-28 16:32:27.785915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:30.141 [2024-11-28 16:32:27.785972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.141 [2024-11-28 16:32:27.785988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.401 #37 NEW cov: 12499 ft: 15588 corp: 24/1133b lim: 85 exec/s: 37 rss: 73Mb L: 55/83 MS: 1 InsertByte- 00:08:30.401 [2024-11-28 16:32:27.845960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.401 [2024-11-28 16:32:27.845986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.401 [2024-11-28 16:32:27.846034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.401 [2024-11-28 16:32:27.846050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.401 [2024-11-28 16:32:27.846104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.401 [2024-11-28 16:32:27.846120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.401 #38 NEW cov: 12499 ft: 15600 corp: 25/1188b lim: 85 exec/s: 38 rss: 73Mb L: 55/83 MS: 1 ChangeByte- 00:08:30.401 [2024-11-28 16:32:27.905939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.401 [2024-11-28 16:32:27.905965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.401 [2024-11-28 16:32:27.906002] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.401 [2024-11-28 16:32:27.906017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.401 #39 NEW cov: 12499 ft: 15612 corp: 26/1232b lim: 85 exec/s: 39 rss: 73Mb L: 44/83 MS: 1 ChangeBinInt- 00:08:30.401 [2024-11-28 16:32:27.946080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.401 [2024-11-28 16:32:27.946106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.401 [2024-11-28 16:32:27.946144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.401 [2024-11-28 16:32:27.946160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.401 #40 NEW cov: 12499 ft: 15648 corp: 27/1268b lim: 85 exec/s: 40 rss: 73Mb L: 36/83 MS: 1 ChangeBinInt- 00:08:30.401 [2024-11-28 16:32:28.006273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.401 [2024-11-28 16:32:28.006302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.401 [2024-11-28 16:32:28.006341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.401 [2024-11-28 
16:32:28.006359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.661 #41 NEW cov: 12499 ft: 15710 corp: 28/1316b lim: 85 exec/s: 41 rss: 74Mb L: 48/83 MS: 1 ChangeBit- 00:08:30.661 [2024-11-28 16:32:28.066569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.661 [2024-11-28 16:32:28.066603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.661 [2024-11-28 16:32:28.066645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.661 [2024-11-28 16:32:28.066661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.661 [2024-11-28 16:32:28.066714] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.661 [2024-11-28 16:32:28.066729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.661 #42 NEW cov: 12499 ft: 15718 corp: 29/1370b lim: 85 exec/s: 42 rss: 74Mb L: 54/83 MS: 1 InsertByte- 00:08:30.661 [2024-11-28 16:32:28.106633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.661 [2024-11-28 16:32:28.106663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.661 [2024-11-28 16:32:28.106699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.661 [2024-11-28 16:32:28.106730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.661 [2024-11-28 16:32:28.106786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.661 [2024-11-28 16:32:28.106802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.661 #43 NEW cov: 12499 ft: 15728 corp: 30/1429b lim: 85 exec/s: 43 rss: 74Mb L: 59/83 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:30.661 [2024-11-28 16:32:28.146775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.661 [2024-11-28 16:32:28.146801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.661 [2024-11-28 16:32:28.146849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.661 [2024-11-28 16:32:28.146866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.661 [2024-11-28 16:32:28.146922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.661 [2024-11-28 16:32:28.146938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.661 #44 NEW cov: 12499 ft: 15738 corp: 31/1483b lim: 85 exec/s: 44 rss: 74Mb L: 54/83 MS: 1 ChangeBinInt- 00:08:30.661 [2024-11-28 
16:32:28.186605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.661 [2024-11-28 16:32:28.186633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.661 #45 NEW cov: 12499 ft: 15759 corp: 32/1500b lim: 85 exec/s: 45 rss: 74Mb L: 17/83 MS: 1 CopyPart- 00:08:30.661 [2024-11-28 16:32:28.246914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.661 [2024-11-28 16:32:28.246940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.661 [2024-11-28 16:32:28.246979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.661 [2024-11-28 16:32:28.246995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.661 #46 NEW cov: 12499 ft: 15779 corp: 33/1548b lim: 85 exec/s: 46 rss: 74Mb L: 48/83 MS: 1 ChangeBinInt- 00:08:30.661 [2024-11-28 16:32:28.307418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.661 [2024-11-28 16:32:28.307445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.921 [2024-11-28 16:32:28.307492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.921 [2024-11-28 16:32:28.307509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.921 [2024-11-28 16:32:28.307562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:30.921 [2024-11-28 16:32:28.307579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.921 [2024-11-28 16:32:28.307633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:30.921 [2024-11-28 16:32:28.307658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.921 #47 NEW cov: 12499 ft: 15849 corp: 34/1631b lim: 85 exec/s: 47 rss: 74Mb L: 83/83 MS: 1 ChangeBinInt- 00:08:30.921 [2024-11-28 16:32:28.367228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:30.921 [2024-11-28 16:32:28.367254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.921 [2024-11-28 16:32:28.367290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:30.921 [2024-11-28 16:32:28.367307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.921 #48 NEW cov: 12499 ft: 15879 corp: 35/1675b lim: 85 exec/s: 24 rss: 74Mb L: 44/83 MS: 1 ChangeASCIIInt- 00:08:30.921 #48 DONE cov: 12499 ft: 15879 corp: 35/1675b lim: 85 exec/s: 24 rss: 74Mb 00:08:30.921 ###### Recommended dictionary. 
###### 00:08:30.921 "\002\000" # Uses: 1 00:08:30.921 "\002\000\000\000" # Uses: 0 00:08:30.921 ###### End of recommended dictionary. ###### 00:08:30.921 Done 48 runs in 2 second(s) 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:30.921 16:32:28 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:30.921 [2024-11-28 16:32:28.543828] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:30.921 [2024-11-28 16:32:28.543924] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3764610 ] 00:08:31.181 [2024-11-28 16:32:28.717654] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.181 [2024-11-28 16:32:28.739462] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.181 [2024-11-28 16:32:28.791583] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:31.181 [2024-11-28 16:32:28.807941] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:31.181 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.181 INFO: Seed: 232239479 00:08:31.440 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:31.440 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:31.440 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:31.440 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.440 #2 INITED exec/s: 0 rss: 65Mb 00:08:31.440 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:31.440 This may also happen if the target rejected all inputs we tried so far 00:08:31.440 [2024-11-28 16:32:28.852661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.440 [2024-11-28 16:32:28.852695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.699 NEW_FUNC[1/715]: 0x47d1f8 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:31.699 NEW_FUNC[2/715]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:31.699 #4 NEW cov: 12201 ft: 12188 corp: 2/6b lim: 25 exec/s: 0 rss: 72Mb L: 5/5 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:31.699 [2024-11-28 16:32:29.193450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.699 [2024-11-28 16:32:29.193485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.699 #5 NEW cov: 12318 ft: 12708 corp: 3/12b lim: 25 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 InsertByte- 00:08:31.699 [2024-11-28 16:32:29.283605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.699 [2024-11-28 16:32:29.283636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.959 #6 NEW cov: 12324 ft: 13087 corp: 4/18b lim: 25 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 CopyPart- 00:08:31.959 [2024-11-28 16:32:29.373810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.959 [2024-11-28 16:32:29.373840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.959 #7 NEW cov: 12409 ft: 13331 corp: 5/24b lim: 25 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:31.959 [2024-11-28 
16:32:29.464050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.959 [2024-11-28 16:32:29.464079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.959 #8 NEW cov: 12409 ft: 13449 corp: 6/30b lim: 25 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:31.959 [2024-11-28 16:32:29.524197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.959 [2024-11-28 16:32:29.524228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.959 #9 NEW cov: 12409 ft: 13503 corp: 7/37b lim: 25 exec/s: 0 rss: 73Mb L: 7/7 MS: 1 InsertByte- 00:08:31.959 [2024-11-28 16:32:29.574346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:31.959 [2024-11-28 16:32:29.574377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.217 #12 NEW cov: 12409 ft: 13592 corp: 8/44b lim: 25 exec/s: 0 rss: 73Mb L: 7/7 MS: 3 EraseBytes-CrossOver-CopyPart- 00:08:32.217 [2024-11-28 16:32:29.624478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.217 [2024-11-28 16:32:29.624509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.217 #13 NEW cov: 12409 ft: 13719 corp: 9/51b lim: 25 exec/s: 0 rss: 73Mb L: 7/7 MS: 1 InsertByte- 00:08:32.217 [2024-11-28 16:32:29.674579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.217 [2024-11-28 16:32:29.674615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.217 #15 NEW cov: 12409 ft: 13817 corp: 10/58b lim: 25 exec/s: 0 rss: 73Mb L: 7/7 MS: 2 CopyPart-CrossOver- 00:08:32.217 [2024-11-28 16:32:29.735391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.217 [2024-11-28 16:32:29.735425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.217 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:32.217 #16 NEW cov: 12426 ft: 13920 corp: 11/64b lim: 25 exec/s: 0 rss: 73Mb L: 6/7 MS: 1 ChangeByte- 00:08:32.217 [2024-11-28 16:32:29.795478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.217 [2024-11-28 16:32:29.795505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.217 #17 NEW cov: 12426 ft: 13994 corp: 12/69b lim: 25 exec/s: 0 rss: 73Mb L: 5/7 MS: 1 EraseBytes- 00:08:32.217 [2024-11-28 16:32:29.856017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.217 [2024-11-28 16:32:29.856045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.217 [2024-11-28 16:32:29.856092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.217 [2024-11-28 16:32:29.856109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.217 [2024-11-28 16:32:29.856168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.218 [2024-11-28 16:32:29.856183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.218 [2024-11-28 16:32:29.856241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.218 [2024-11-28 16:32:29.856256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.476 #18 NEW cov: 12426 ft: 14653 corp: 13/92b lim: 25 exec/s: 18 rss: 73Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:32.476 [2024-11-28 16:32:29.916244] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.476 [2024-11-28 16:32:29.916270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.476 [2024-11-28 16:32:29.916322] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.476 [2024-11-28 16:32:29.916338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.476 [2024-11-28 16:32:29.916390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.476 [2024-11-28 16:32:29.916405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.476 [2024-11-28 16:32:29.916457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.476 [2024-11-28 16:32:29.916472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.476 #19 NEW cov: 12426 ft: 14669 corp: 14/115b lim: 25 exec/s: 19 rss: 73Mb L: 23/23 MS: 1 ShuffleBytes- 00:08:32.476 [2024-11-28 16:32:29.975951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.476 [2024-11-28 16:32:29.975981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.476 #20 NEW cov: 12426 ft: 14725 corp: 15/122b lim: 25 exec/s: 20 rss: 73Mb L: 7/23 MS: 1 InsertByte- 00:08:32.476 [2024-11-28 16:32:30.036569] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.476 [2024-11-28 16:32:30.036602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.476 [2024-11-28 16:32:30.036651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.476 [2024-11-28 16:32:30.036668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.476 [2024-11-28 16:32:30.036724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT 
(0e) sqid:1 cid:2 nsid:0 00:08:32.476 [2024-11-28 16:32:30.036738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.476 [2024-11-28 16:32:30.036794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:32.476 [2024-11-28 16:32:30.036812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.476 #21 NEW cov: 12426 ft: 14816 corp: 16/145b lim: 25 exec/s: 21 rss: 73Mb L: 23/23 MS: 1 ChangeBinInt- 00:08:32.476 [2024-11-28 16:32:30.096355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.476 [2024-11-28 16:32:30.096389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.476 #22 NEW cov: 12426 ft: 14898 corp: 17/150b lim: 25 exec/s: 22 rss: 73Mb L: 5/23 MS: 1 EraseBytes- 00:08:32.735 [2024-11-28 16:32:30.136450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.736 [2024-11-28 16:32:30.136479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.736 #23 NEW cov: 12426 ft: 14913 corp: 18/156b lim: 25 exec/s: 23 rss: 73Mb L: 6/23 MS: 1 InsertByte- 00:08:32.736 [2024-11-28 16:32:30.196884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.736 [2024-11-28 16:32:30.196910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.736 [2024-11-28 16:32:30.196956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.736 [2024-11-28 16:32:30.196970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.736 [2024-11-28 16:32:30.197028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.736 [2024-11-28 16:32:30.197044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.736 #24 NEW cov: 12426 ft: 15160 corp: 19/173b lim: 25 exec/s: 24 rss: 73Mb L: 17/23 MS: 1 InsertRepeatedBytes- 00:08:32.736 [2024-11-28 16:32:30.236749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.736 [2024-11-28 16:32:30.236777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.736 #25 NEW cov: 12426 ft: 15197 corp: 20/180b lim: 25 exec/s: 25 rss: 73Mb L: 7/23 MS: 1 ChangeByte- 00:08:32.736 [2024-11-28 16:32:30.296888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.736 [2024-11-28 16:32:30.296917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.736 #26 NEW cov: 12426 ft: 15226 corp: 21/188b lim: 25 exec/s: 26 rss: 74Mb L: 8/23 MS: 1 CopyPart- 00:08:32.736 [2024-11-28 16:32:30.337242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.736 [2024-11-28 16:32:30.337269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.736 [2024-11-28 16:32:30.337314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.736 [2024-11-28 16:32:30.337331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.736 [2024-11-28 16:32:30.337390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.736 [2024-11-28 16:32:30.337407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.736 #27 NEW cov: 12426 ft: 15262 corp: 22/203b lim: 25 exec/s: 27 rss: 74Mb L: 15/23 MS: 1 CMP- DE: "\001t\017FIE\223\000"- 00:08:32.736 [2024-11-28 16:32:30.377102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.736 [2024-11-28 16:32:30.377131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.995 #28 NEW cov: 12426 ft: 15289 corp: 23/210b lim: 25 exec/s: 28 rss: 74Mb L: 7/23 MS: 1 ChangeByte- 00:08:32.995 [2024-11-28 16:32:30.417484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.995 [2024-11-28 16:32:30.417511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.995 [2024-11-28 16:32:30.417556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.995 [2024-11-28 16:32:30.417572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.995 [2024-11-28 16:32:30.417632] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.995 [2024-11-28 16:32:30.417649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.995 #29 NEW cov: 12426 ft: 15315 corp: 24/226b lim: 25 exec/s: 29 rss: 74Mb L: 16/23 MS: 1 InsertRepeatedBytes- 00:08:32.995 [2024-11-28 16:32:30.477633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.995 [2024-11-28 16:32:30.477660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.995 [2024-11-28 16:32:30.477711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:32.995 [2024-11-28 16:32:30.477727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.995 [2024-11-28 16:32:30.477785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:32.995 [2024-11-28 16:32:30.477801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.995 #30 NEW cov: 12426 ft: 15359 corp: 25/241b lim: 25 exec/s: 30 rss: 74Mb L: 15/23 MS: 1 
ChangeBinInt- 00:08:32.995 [2024-11-28 16:32:30.537578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.995 [2024-11-28 16:32:30.537609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.995 #31 NEW cov: 12426 ft: 15385 corp: 26/246b lim: 25 exec/s: 31 rss: 74Mb L: 5/23 MS: 1 CopyPart- 00:08:32.995 [2024-11-28 16:32:30.577674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.995 [2024-11-28 16:32:30.577703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.995 #32 NEW cov: 12426 ft: 15388 corp: 27/252b lim: 25 exec/s: 32 rss: 74Mb L: 6/23 MS: 1 CrossOver- 00:08:32.995 [2024-11-28 16:32:30.617779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:32.995 [2024-11-28 16:32:30.617806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.995 #33 NEW cov: 12426 ft: 15432 corp: 28/259b lim: 25 exec/s: 33 rss: 74Mb L: 7/23 MS: 1 InsertByte- 00:08:33.254 [2024-11-28 16:32:30.658035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.254 [2024-11-28 16:32:30.658062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 16:32:30.658104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.254 [2024-11-28 16:32:30.658121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.254 #34 NEW cov: 12426 ft: 15712 corp: 29/272b lim: 25 exec/s: 34 rss: 74Mb L: 13/23 MS: 1 PersAutoDict- DE: "\001t\017FIE\223\000"- 00:08:33.254 [2024-11-28 16:32:30.698446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.254 [2024-11-28 16:32:30.698474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 16:32:30.698526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:33.254 [2024-11-28 16:32:30.698543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 16:32:30.698595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:33.254 [2024-11-28 16:32:30.698616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.254 [2024-11-28 16:32:30.698673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:33.255 [2024-11-28 16:32:30.698700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.255 #35 NEW cov: 12426 ft: 15724 corp: 30/295b lim: 25 exec/s: 35 rss: 74Mb L: 23/23 MS: 1 CopyPart- 00:08:33.255 [2024-11-28 16:32:30.758265] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.255 [2024-11-28 16:32:30.758293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.255 #36 NEW cov: 12433 ft: 15744 corp: 31/302b lim: 25 exec/s: 36 rss: 74Mb L: 7/23 MS: 1 ChangeBit- 00:08:33.255 [2024-11-28 16:32:30.798343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.255 [2024-11-28 16:32:30.798372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.255 #37 NEW cov: 12433 ft: 15755 corp: 32/309b lim: 25 exec/s: 37 rss: 74Mb L: 7/23 MS: 1 ChangeByte- 00:08:33.255 [2024-11-28 16:32:30.858496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:33.255 [2024-11-28 16:32:30.858525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.255 #38 NEW cov: 12433 ft: 15796 corp: 33/314b lim: 25 exec/s: 19 rss: 74Mb L: 5/23 MS: 1 ChangeBinInt- 00:08:33.255 #38 DONE cov: 12433 ft: 15796 corp: 33/314b lim: 25 exec/s: 19 rss: 74Mb 00:08:33.255 ###### Recommended dictionary. ###### 00:08:33.255 "\001t\017FIE\223\000" # Uses: 1 00:08:33.255 ###### End of recommended dictionary. ###### 00:08:33.255 Done 38 runs in 2 second(s) 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:33.514 16:32:30 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:33.514 16:32:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:33.514 16:32:31 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:33.514 16:32:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:33.514 [2024-11-28 16:32:31.030056] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:33.514 [2024-11-28 16:32:31.030142] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3764994 ] 00:08:33.773 [2024-11-28 16:32:31.217097] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.773 [2024-11-28 16:32:31.239197] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:33.773 [2024-11-28 16:32:31.291480] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:33.773 [2024-11-28 16:32:31.307846] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:33.773 INFO: Running with entropic power schedule (0xFF, 100). 00:08:33.773 INFO: Seed: 2730230648 00:08:33.773 INFO: Loaded 1 modules (384223 inline 8-bit counters): 384223 [0x2a3744c, 0x2a9512b), 00:08:33.773 INFO: Loaded 1 PC tables (384223 PCs): 384223 [0x2a95130,0x3071f20), 00:08:33.773 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:33.773 INFO: A corpus is not provided, starting from an empty corpus 00:08:33.773 #2 INITED exec/s: 0 rss: 64Mb 00:08:33.773 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:33.773 This may also happen if the target rejected all inputs we tried so far 00:08:33.773 [2024-11-28 16:32:31.353293] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.773 [2024-11-28 16:32:31.353323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.773 [2024-11-28 16:32:31.353377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.773 [2024-11-28 16:32:31.353395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.031 NEW_FUNC[1/716]: 0x47e2e8 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:34.031 NEW_FUNC[2/716]: 0x48ef68 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:34.031 #11 NEW cov: 12277 ft: 12274 corp: 2/44b lim: 100 exec/s: 0 rss: 72Mb L: 43/43 MS: 4 InsertByte-InsertByte-CrossOver-InsertRepeatedBytes- 00:08:34.290 [2024-11-28 16:32:31.684014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.684047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.290 [2024-11-28 16:32:31.684100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.684117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.290 #12 NEW cov: 12390 ft: 12845 corp: 3/87b lim: 100 exec/s: 0 rss: 72Mb L: 43/43 MS: 1 CrossOver- 00:08:34.290 [2024-11-28 16:32:31.744115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:14106333700321166275 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.744145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.290 [2024-11-28 16:32:31.744198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:14106333703424951235 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.744214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.290 #13 NEW cov: 12396 ft: 13192 corp: 4/144b lim: 100 exec/s: 0 rss: 72Mb L: 57/57 MS: 1 InsertRepeatedBytes- 00:08:34.290 [2024-11-28 16:32:31.784170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.784198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.290 [2024-11-28 16:32:31.784252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2818048 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 
16:32:31.784269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.290 #14 NEW cov: 12481 ft: 13464 corp: 5/187b lim: 100 exec/s: 0 rss: 72Mb L: 43/57 MS: 1 ChangeBinInt- 00:08:34.290 [2024-11-28 16:32:31.844366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.844393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.290 [2024-11-28 16:32:31.844432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.844448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.290 #20 NEW cov: 12481 ft: 13556 corp: 6/231b lim: 100 exec/s: 0 rss: 72Mb L: 44/57 MS: 1 InsertByte- 00:08:34.290 [2024-11-28 16:32:31.884610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:14106333700321166275 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.884638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.290 [2024-11-28 16:32:31.884693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:14106333703424951235 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.884710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.290 [2024-11-28 16:32:31.884763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:14106333703424951235 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.290 [2024-11-28 16:32:31.884781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.290 #21 NEW cov: 12481 ft: 13955 corp: 7/297b lim: 100 exec/s: 0 rss: 72Mb L: 66/66 MS: 1 CopyPart- 00:08:34.549 [2024-11-28 16:32:31.944662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.549 [2024-11-28 16:32:31.944690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.549 [2024-11-28 16:32:31.944735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2818048 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.549 [2024-11-28 16:32:31.944768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.549 #22 NEW cov: 12481 ft: 14036 corp: 8/340b lim: 100 exec/s: 0 rss: 72Mb L: 43/66 MS: 1 ShuffleBytes- 00:08:34.549 [2024-11-28 16:32:32.004954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.004981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.550 [2024-11-28 
16:32:32.005027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13546827679130451968 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.005044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.550 [2024-11-28 16:32:32.005097] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.005113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.550 #23 NEW cov: 12481 ft: 14051 corp: 9/405b lim: 100 exec/s: 0 rss: 73Mb L: 65/66 MS: 1 CopyPart- 00:08:34.550 [2024-11-28 16:32:32.064957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:254 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.064984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.550 [2024-11-28 16:32:32.065023] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.065039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.550 #24 NEW cov: 12481 ft: 14083 corp: 10/449b lim: 100 exec/s: 0 rss: 73Mb L: 44/66 MS: 1 ChangeBinInt- 00:08:34.550 [2024-11-28 16:32:32.105254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.105280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.550 [2024-11-28 16:32:32.105316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13546827679130468352 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.105333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.550 [2024-11-28 16:32:32.105391] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.105407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.550 #25 NEW cov: 12481 ft: 14180 corp: 11/514b lim: 100 exec/s: 0 rss: 73Mb L: 65/66 MS: 1 ChangeBit- 00:08:34.550 [2024-11-28 16:32:32.165275] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:254 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.165302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.550 [2024-11-28 16:32:32.165339] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.550 [2024-11-28 16:32:32.165355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:34.838 #26 NEW cov: 12481 ft: 14186 corp: 12/558b lim: 100 exec/s: 0 rss: 73Mb L: 44/66 MS: 1 ChangeBit- 00:08:34.838 [2024-11-28 16:32:32.225412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939031 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.225439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.838 [2024-11-28 16:32:32.225491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.225507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.838 NEW_FUNC[1/1]: 0x1c16738 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:34.838 #27 NEW cov: 12504 ft: 14221 corp: 13/602b lim: 100 exec/s: 0 rss: 73Mb L: 44/66 MS: 1 InsertByte- 00:08:34.838 [2024-11-28 16:32:32.265560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.265587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.838 [2024-11-28 16:32:32.265628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.265644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.838 #28 NEW cov: 12504 ft: 14260 corp: 14/646b lim: 100 exec/s: 0 rss: 73Mb L: 44/66 MS: 1 ShuffleBytes- 00:08:34.838 [2024-11-28 16:32:32.305615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:160366472462336000 len:2561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.305641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.838 [2024-11-28 16:32:32.305676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13546827683425419008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.305693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.838 #29 NEW cov: 12504 ft: 14272 corp: 15/698b lim: 100 exec/s: 0 rss: 73Mb L: 52/66 MS: 1 CMP- DE: "\001\000\000\000\0029\274u"- 00:08:34.838 [2024-11-28 16:32:32.345928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.345956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.838 [2024-11-28 16:32:32.345997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13546827679130468352 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.346015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.838 [2024-11-28 16:32:32.346069] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.346085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.838 #30 NEW cov: 12504 ft: 14278 corp: 16/763b lim: 100 exec/s: 30 rss: 73Mb L: 65/66 MS: 1 ShuffleBytes- 00:08:34.838 [2024-11-28 16:32:32.406111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.406139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.838 [2024-11-28 16:32:32.406173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13546827679130468352 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.406190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.838 [2024-11-28 16:32:32.406245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.406261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.838 #31 NEW cov: 12504 ft: 14351 corp: 17/828b lim: 100 exec/s: 31 rss: 73Mb L: 65/66 MS: 1 ChangeBinInt- 00:08:34.838 [2024-11-28 16:32:32.446050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:8606711809 len:29953 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.446078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.838 [2024-11-28 16:32:32.446128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13546827683425419008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:34.838 [2024-11-28 16:32:32.446144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.098 #32 NEW cov: 12504 ft: 14376 corp: 18/880b lim: 100 exec/s: 32 rss: 73Mb L: 52/66 MS: 1 PersAutoDict- DE: "\001\000\000\000\0029\274u"- 00:08:35.098 [2024-11-28 16:32:32.506064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.506091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.098 #33 NEW cov: 12504 ft: 15138 corp: 19/904b lim: 100 exec/s: 33 rss: 73Mb L: 24/66 MS: 1 EraseBytes- 00:08:35.098 [2024-11-28 16:32:32.566260] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967296 len:3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.566287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.098 #34 NEW cov: 12504 ft: 15150 corp: 20/928b lim: 100 exec/s: 34 rss: 73Mb L: 24/66 MS: 1 PersAutoDict- DE: "\001\000\000\000\0029\274u"- 00:08:35.098 [2024-11-28 16:32:32.626577] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:160366472462336000 len:2561 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.626609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.098 [2024-11-28 16:32:32.626648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13546827683425419008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.626669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.098 #35 NEW cov: 12504 ft: 15189 corp: 21/980b lim: 100 exec/s: 35 rss: 73Mb L: 52/66 MS: 1 ChangeByte- 00:08:35.098 [2024-11-28 16:32:32.666945] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:14106333700321166275 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.666974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.098 [2024-11-28 16:32:32.667019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:14178673876263027908 len:50373 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.667035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.098 [2024-11-28 16:32:32.667087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:14178672772456432836 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.667104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.098 [2024-11-28 16:32:32.667159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:14106333703424951235 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.667175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.098 #36 NEW cov: 12504 ft: 15551 corp: 22/1079b lim: 100 exec/s: 36 rss: 73Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:35.098 [2024-11-28 16:32:32.726871] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.726898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.098 [2024-11-28 16:32:32.726936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.098 [2024-11-28 16:32:32.726952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.357 #37 NEW cov: 12504 ft: 15562 corp: 23/1122b lim: 100 exec/s: 37 rss: 73Mb L: 43/99 MS: 1 ShuffleBytes- 00:08:35.357 [2024-11-28 16:32:32.767123] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:254 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.767149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.357 [2024-11-28 16:32:32.767211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.767226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.357 [2024-11-28 16:32:32.767281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.767298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.357 #38 NEW cov: 12504 ft: 15564 corp: 24/1194b lim: 100 exec/s: 38 rss: 74Mb L: 72/99 MS: 1 InsertRepeatedBytes- 00:08:35.357 [2024-11-28 16:32:32.827147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.827174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.357 [2024-11-28 16:32:32.827220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.827241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.357 #39 NEW cov: 12504 ft: 15565 corp: 25/1239b lim: 100 exec/s: 39 rss: 74Mb L: 45/99 MS: 1 InsertByte- 00:08:35.357 [2024-11-28 16:32:32.867240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.867267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.357 [2024-11-28 16:32:32.867304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.867319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.357 #40 NEW cov: 12504 ft: 15577 corp: 26/1284b lim: 100 exec/s: 40 rss: 74Mb L: 45/99 MS: 1 ChangeByte- 00:08:35.357 [2024-11-28 16:32:32.927726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939031 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.927754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.357 [2024-11-28 16:32:32.927804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.927821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.357 [2024-11-28 16:32:32.927873] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:42949672960 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.927890] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.357 [2024-11-28 16:32:32.927947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.357 [2024-11-28 16:32:32.927962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.357 #41 NEW cov: 12504 ft: 15584 corp: 27/1364b lim: 100 exec/s: 41 rss: 74Mb L: 80/99 MS: 1 CopyPart- 00:08:35.358 [2024-11-28 16:32:32.987749] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1563036170 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.358 [2024-11-28 16:32:32.987776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.358 [2024-11-28 16:32:32.987822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:52917295621603392 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.358 [2024-11-28 16:32:32.987839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.358 [2024-11-28 16:32:32.987897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.358 [2024-11-28 16:32:32.987914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.616 #42 NEW cov: 12504 ft: 15606 corp: 28/1430b lim: 100 exec/s: 42 rss: 74Mb L: 66/99 MS: 1 InsertByte- 00:08:35.617 [2024-11-28 16:32:33.048057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:254 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.048085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.048139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.048158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.048211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.048226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.048281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.048297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.617 #43 NEW cov: 12504 ft: 15641 corp: 29/1520b lim: 100 exec/s: 43 rss: 74Mb L: 90/99 MS: 1 CrossOver- 00:08:35.617 [2024-11-28 16:32:33.107910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:14106333700321166275 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.107937] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.107975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:14106333703424951235 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.107992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.617 #44 NEW cov: 12504 ft: 15671 corp: 30/1577b lim: 100 exec/s: 44 rss: 74Mb L: 57/99 MS: 1 ShuffleBytes- 00:08:35.617 [2024-11-28 16:32:33.148328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.148355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.148403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.148420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.148472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.148487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.148541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:11 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.148557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.617 #45 NEW cov: 12504 ft: 15714 corp: 31/1660b lim: 100 exec/s: 45 rss: 74Mb L: 83/99 MS: 1 CrossOver- 00:08:35.617 [2024-11-28 16:32:33.188607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:14106333700321166275 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.188637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.188687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:14178673876263027908 len:50373 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.188705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.188760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:14178672772456432836 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.188776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.188833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:14106333703428883395 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.188849] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.617 [2024-11-28 16:32:33.188905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:14106333703424951235 len:50116 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.188921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.617 #46 NEW cov: 12504 ft: 15752 corp: 32/1760b lim: 100 exec/s: 46 rss: 74Mb L: 100/100 MS: 1 InsertByte- 00:08:35.617 [2024-11-28 16:32:33.248137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967296 len:255 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.617 [2024-11-28 16:32:33.248179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.876 #47 NEW cov: 12504 ft: 15799 corp: 33/1784b lim: 100 exec/s: 47 rss: 74Mb L: 24/100 MS: 1 ChangeBinInt- 00:08:35.876 [2024-11-28 16:32:33.308460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.876 [2024-11-28 16:32:33.308486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.876 [2024-11-28 16:32:33.308522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:2818048 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.876 [2024-11-28 16:32:33.308538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.876 #48 NEW cov: 12504 ft: 15813 corp: 34/1827b lim: 100 exec/s: 48 rss: 74Mb L: 43/100 MS: 1 ShuffleBytes- 00:08:35.876 [2024-11-28 16:32:33.348636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:1560939008 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.876 [2024-11-28 16:32:33.348663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.876 [2024-11-28 16:32:33.348699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.876 [2024-11-28 16:32:33.348715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.876 #49 NEW cov: 12504 ft: 15836 corp: 35/1871b lim: 100 exec/s: 24 rss: 74Mb L: 44/100 MS: 1 CopyPart- 00:08:35.876 #49 DONE cov: 12504 ft: 15836 corp: 35/1871b lim: 100 exec/s: 24 rss: 74Mb 00:08:35.876 ###### Recommended dictionary. ###### 00:08:35.876 "\001\000\000\000\0029\274u" # Uses: 2 00:08:35.876 ###### End of recommended dictionary. 
###### 00:08:35.876 Done 49 runs in 2 second(s) 00:08:35.876 16:32:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:35.876 16:32:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:35.876 16:32:33 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:35.876 16:32:33 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:35.876 00:08:35.876 real 1m3.780s 00:08:35.876 user 1m39.495s 00:08:35.876 sys 0m8.184s 00:08:35.876 16:32:33 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:35.876 16:32:33 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:35.876 ************************************ 00:08:35.876 END TEST nvmf_llvm_fuzz 00:08:35.876 ************************************ 00:08:36.136 16:32:33 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:36.136 16:32:33 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:36.136 16:32:33 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:36.136 16:32:33 llvm_fuzz -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:36.136 16:32:33 llvm_fuzz -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:36.136 16:32:33 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:36.136 ************************************ 00:08:36.136 START TEST vfio_llvm_fuzz 00:08:36.136 ************************************ 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:36.136 * Looking for test storage... 
00:08:36.136 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:36.136 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:36.136 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:36.136 --rc genhtml_branch_coverage=1 00:08:36.136 --rc genhtml_function_coverage=1 00:08:36.136 --rc genhtml_legend=1 00:08:36.136 --rc geninfo_all_blocks=1 00:08:36.137 --rc geninfo_unexecuted_blocks=1 00:08:36.137 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:36.137 ' 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:36.137 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:36.137 --rc genhtml_branch_coverage=1 00:08:36.137 --rc genhtml_function_coverage=1 00:08:36.137 --rc genhtml_legend=1 00:08:36.137 --rc geninfo_all_blocks=1 00:08:36.137 --rc geninfo_unexecuted_blocks=1 00:08:36.137 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:36.137 ' 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:36.137 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:36.137 --rc genhtml_branch_coverage=1 00:08:36.137 --rc genhtml_function_coverage=1 00:08:36.137 --rc genhtml_legend=1 00:08:36.137 --rc geninfo_all_blocks=1 00:08:36.137 --rc geninfo_unexecuted_blocks=1 00:08:36.137 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:36.137 ' 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:36.137 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:36.137 --rc genhtml_branch_coverage=1 00:08:36.137 --rc genhtml_function_coverage=1 00:08:36.137 --rc genhtml_legend=1 00:08:36.137 --rc geninfo_all_blocks=1 00:08:36.137 --rc geninfo_unexecuted_blocks=1 00:08:36.137 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:36.137 ' 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- 
# CONFIG_OCF_PATH= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_AIO_FSDEV=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_UBLK=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_ISAL_CRYPTO=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_OPENSSL_PATH= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OCF=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_FUSE=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_VTUNE_DIR= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FSDEV=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_CRYPTO=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_PGO_USE=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_VHOST=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_DAOS=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DAOS_DIR= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_UNIT_TESTS=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_VIRTIO=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_DPDK_UADK=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_COVERAGE=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_RDMA=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_LZ4=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_URING_PATH= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_XNVME=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_VFIO_USER=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_ARCH=native 00:08:36.137 16:32:33 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_HAVE_EVP_MAC=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_URING_ZNS=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_WERROR=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_HAVE_LIBBSD=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_UBSAN=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_IPSEC_MB_DIR= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_GOLANG=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_ISAL=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_IDXD_KERNEL=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_RDMA_PROV=verbs 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_APPS=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_SHARED=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_HAVE_KEYUTILS=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_FC_PATH= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_FC=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_AVAHI=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_FIO_PLUGIN=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_RAID5F=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_EXAMPLES=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_TESTS=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_CRYPTO_MLX5=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_MAX_LCORES=128 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_IPSEC_MB=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_PGO_DIR= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_DEBUG=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_CROSS_PREFIX= 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_COPY_FILE_RANGE=y 00:08:36.137 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_URING=n 00:08:36.138 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:36.138 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:36.400 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:36.401 #define SPDK_CONFIG_H 00:08:36.401 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:36.401 #define SPDK_CONFIG_APPS 1 00:08:36.401 #define SPDK_CONFIG_ARCH native 00:08:36.401 #undef SPDK_CONFIG_ASAN 00:08:36.401 #undef SPDK_CONFIG_AVAHI 00:08:36.401 #undef SPDK_CONFIG_CET 00:08:36.401 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:36.401 #define SPDK_CONFIG_COVERAGE 1 00:08:36.401 #define SPDK_CONFIG_CROSS_PREFIX 00:08:36.401 #undef SPDK_CONFIG_CRYPTO 00:08:36.401 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:36.401 #undef SPDK_CONFIG_CUSTOMOCF 00:08:36.401 #undef SPDK_CONFIG_DAOS 00:08:36.401 #define SPDK_CONFIG_DAOS_DIR 00:08:36.401 #define SPDK_CONFIG_DEBUG 1 00:08:36.401 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:36.401 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:36.401 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:36.401 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.401 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:36.401 #undef SPDK_CONFIG_DPDK_UADK 00:08:36.401 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:36.401 #define SPDK_CONFIG_EXAMPLES 1 00:08:36.401 #undef SPDK_CONFIG_FC 00:08:36.401 #define SPDK_CONFIG_FC_PATH 00:08:36.401 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:36.401 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:36.401 #define SPDK_CONFIG_FSDEV 1 00:08:36.401 #undef SPDK_CONFIG_FUSE 00:08:36.401 #define SPDK_CONFIG_FUZZER 1 00:08:36.401 
#define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:36.401 #undef SPDK_CONFIG_GOLANG 00:08:36.401 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:36.401 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:36.401 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:36.401 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:36.401 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:36.401 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:36.401 #undef SPDK_CONFIG_HAVE_LZ4 00:08:36.401 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:36.401 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:36.401 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:36.401 #define SPDK_CONFIG_IDXD 1 00:08:36.401 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:36.401 #undef SPDK_CONFIG_IPSEC_MB 00:08:36.401 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:36.401 #define SPDK_CONFIG_ISAL 1 00:08:36.401 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:36.401 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:36.401 #define SPDK_CONFIG_LIBDIR 00:08:36.401 #undef SPDK_CONFIG_LTO 00:08:36.401 #define SPDK_CONFIG_MAX_LCORES 128 00:08:36.401 #define SPDK_CONFIG_NVME_CUSE 1 00:08:36.401 #undef SPDK_CONFIG_OCF 00:08:36.401 #define SPDK_CONFIG_OCF_PATH 00:08:36.401 #define SPDK_CONFIG_OPENSSL_PATH 00:08:36.401 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:36.401 #define SPDK_CONFIG_PGO_DIR 00:08:36.401 #undef SPDK_CONFIG_PGO_USE 00:08:36.401 #define SPDK_CONFIG_PREFIX /usr/local 00:08:36.401 #undef SPDK_CONFIG_RAID5F 00:08:36.401 #undef SPDK_CONFIG_RBD 00:08:36.401 #define SPDK_CONFIG_RDMA 1 00:08:36.401 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:36.401 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:36.401 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:36.401 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:36.401 #undef SPDK_CONFIG_SHARED 00:08:36.401 #undef SPDK_CONFIG_SMA 00:08:36.401 #define SPDK_CONFIG_TESTS 1 00:08:36.401 #undef SPDK_CONFIG_TSAN 00:08:36.401 #define SPDK_CONFIG_UBLK 1 00:08:36.401 #define SPDK_CONFIG_UBSAN 1 00:08:36.401 #undef SPDK_CONFIG_UNIT_TESTS 00:08:36.401 #undef SPDK_CONFIG_URING 00:08:36.401 #define SPDK_CONFIG_URING_PATH 00:08:36.401 #undef SPDK_CONFIG_URING_ZNS 00:08:36.401 #undef SPDK_CONFIG_USDT 00:08:36.401 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:36.401 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:36.401 #define SPDK_CONFIG_VFIO_USER 1 00:08:36.401 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:36.401 #define SPDK_CONFIG_VHOST 1 00:08:36.401 #define SPDK_CONFIG_VIRTIO 1 00:08:36.401 #undef SPDK_CONFIG_VTUNE 00:08:36.401 #define SPDK_CONFIG_VTUNE_DIR 00:08:36.401 #define SPDK_CONFIG_WERROR 1 00:08:36.401 #define SPDK_CONFIG_WPDK_DIR 00:08:36.401 #undef SPDK_CONFIG_XNVME 00:08:36.401 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- 
paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:36.401 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:36.402 16:32:33 
llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:36.402 16:32:33 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 
00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : v22.11.4 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:36.402 16:32:33 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:36.402 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@179 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@180 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.403 16:32:33 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@185 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@189 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@193 -- # PYTHONDONTWRITEBYTECODE=1 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@197 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@198 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@202 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@203 -- # rm -rf /var/tmp/asan_suppression_file 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # cat 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@240 -- # echo leak:libfuse3.so 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # '[' -z /var/spdk/dependencies ']' 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@249 -- # export DEPENDENCY_DIR 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@253 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@254 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@257 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@258 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@263 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # _LCOV_MAIN=0 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@266 -- # _LCOV_LLVM=1 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV= 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ '' == *clang* ]] 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # [[ 1 -eq 1 ]] 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV=1 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@271 -- # _lcov_opt[_LCOV_MAIN]= 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@276 -- # '[' 0 -eq 0 ']' 00:08:36.403 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # export valgrind= 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@277 -- # valgrind= 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # uname -s 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@283 -- # '[' Linux = Linux ']' 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@284 -- # HUGEMEM=4096 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # export CLEAR_HUGE=yes 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # CLEAR_HUGE=yes 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # MAKE=make 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@288 -- # MAKEFLAGS=-j112 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # export HUGEMEM=4096 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@304 -- # HUGEMEM=4096 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # NO_HUGE=() 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@307 -- # TEST_MODE= 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # [[ -z 3765462 ]] 00:08:36.404 
16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@329 -- # kill -0 3765462 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1666 -- # set_test_storage 2147483648 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@339 -- # [[ -v testdir ]] 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # local requested_size=2147483648 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@342 -- # local mount target_dir 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local -A mounts fss sizes avails uses 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@345 -- # local source fs size avail mount use 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local storage_fallback storage_candidates 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # mktemp -udt spdk.XXXXXX 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # storage_fallback=/tmp/spdk.GKNZfT 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@354 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # [[ -n '' ]] 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@361 -- # [[ -n '' ]] 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@366 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.GKNZfT/tests/vfio /tmp/spdk.GKNZfT 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@369 -- # requested_size=2214592512 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # df -T 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@338 -- # grep -v Filesystem 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_devtmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=devtmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=67108864 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=67108864 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=0 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=/dev/pmem0 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=ext2 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=4096 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=5284429824 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5284425728 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@372 -- # mounts["$mount"]=spdk_root 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=overlay 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=51631099904 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=61730607104 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=10099507200 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30860537856 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865301504 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=4763648 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=12340129792 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=12346122240 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=5992448 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=30863499264 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=30865305600 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=1806336 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # mounts["$mount"]=tmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@372 -- # fss["$mount"]=tmpfs 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # avails["$mount"]=6173044736 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # sizes["$mount"]=6173057024 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # uses["$mount"]=12288 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # read -r source fs size use avail _ mount 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@377 -- # printf '* Looking for test storage...\n' 00:08:36.404 * Looking for test storage... 
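The surrounding trace is autotest_common.sh's set_test_storage routine: it parses `df -T` into per-mount size/avail/use maps (converting df's 1K blocks to bytes), resolves which mount point backs the test directory, and accepts it only if enough space is free. A minimal standalone sketch of that logic, assuming GNU df and bash 4+; the 2 GiB figure and the /tmp default are illustrative stand-ins, not values taken from this run:

  #!/usr/bin/env bash
  # Sketch of the storage probe traced above (assumptions: GNU df, bash 4+).
  requested_size=$((2 * 1024 * 1024 * 1024))  # 2 GiB, mirroring requested_size=2147483648
  testdir=${1:-/tmp}                          # illustrative; the run above probes the fuzz test dir

  declare -A fss sizes avails uses
  # df -T columns: Filesystem Type 1K-blocks Used Available Use% Mounted-on
  while read -r src fs size use avail _ mount; do
    fss["$mount"]=$fs
    sizes["$mount"]=$((size * 1024))   # df reports 1K blocks; convert to bytes
    avails["$mount"]=$((avail * 1024))
    uses["$mount"]=$((use * 1024))
  done < <(df -T | grep -v Filesystem)

  # Resolve the mount point backing $testdir, exactly as the trace does.
  mount=$(df "$testdir" | awk '$1 !~ /Filesystem/{print $6}')

  target_space=${avails[$mount]}
  if (( target_space >= requested_size )); then
    printf '* Found test storage at %s (%d bytes free on %s)\n' "$testdir" "$target_space" "$mount"
  else
    printf '* %s: only %d of %d bytes available\n' "$mount" "$target_space" "$requested_size"
  fi

The real helper additionally walks a list of fallback candidates and, on overlay roots like the one matched here, checks that requested_size plus the mount's current use stays under 95% of its total size before settling on the directory.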
00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # local target_space new_size 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@380 -- # for target_dir in "${storage_candidates[@]}" 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@383 -- # mount=/ 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # target_space=51631099904 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@386 -- # (( target_space == 0 || target_space < requested_size )) 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@389 -- # (( target_space >= requested_size )) 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == tmpfs ]] 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ overlay == ramfs ]] 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # [[ / == / ]] 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@392 -- # new_size=12314099712 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@398 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@399 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.404 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # return 0 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1668 -- # set -o errtrace 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1669 -- # shopt -s extdebug 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1670 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1672 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1673 -- # true 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1675 -- # xtrace_fd 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:36.404 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # [[ y == y ]] 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # awk '{print $NF}' 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lcov --version 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # lt 1.15 2 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:36.405 16:32:33 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # export 'LCOV_OPTS= 00:08:36.405 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:36.405 --rc genhtml_branch_coverage=1 00:08:36.405 --rc genhtml_function_coverage=1 00:08:36.405 --rc genhtml_legend=1 00:08:36.405 --rc geninfo_all_blocks=1 00:08:36.405 --rc geninfo_unexecuted_blocks=1 00:08:36.405 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:36.405 ' 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # LCOV_OPTS=' 00:08:36.405 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:36.405 --rc genhtml_branch_coverage=1 00:08:36.405 --rc genhtml_function_coverage=1 00:08:36.405 --rc genhtml_legend=1 00:08:36.405 --rc geninfo_all_blocks=1 00:08:36.405 --rc geninfo_unexecuted_blocks=1 00:08:36.405 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:36.405 ' 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # export 'LCOV=lcov 00:08:36.405 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:36.405 --rc genhtml_branch_coverage=1 00:08:36.405 --rc genhtml_function_coverage=1 00:08:36.405 --rc genhtml_legend=1 00:08:36.405 --rc geninfo_all_blocks=1 00:08:36.405 --rc geninfo_unexecuted_blocks=1 00:08:36.405 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:36.405 ' 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1695 -- # LCOV='lcov 00:08:36.405 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:36.405 --rc genhtml_branch_coverage=1 00:08:36.405 --rc genhtml_function_coverage=1 00:08:36.405 --rc genhtml_legend=1 00:08:36.405 --rc geninfo_all_blocks=1 00:08:36.405 --rc geninfo_unexecuted_blocks=1 00:08:36.405 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:36.405 ' 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:36.405 16:32:34 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:36.405 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:36.405 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:36.665 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:36.665 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:36.665 16:32:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:36.665 [2024-11-28 16:32:34.075442] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:36.665 [2024-11-28 16:32:34.075526] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3765575 ] 00:08:36.665 [2024-11-28 16:32:34.148449] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.665 [2024-11-28 16:32:34.189183] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.925 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.925 INFO: Seed: 1490268894 00:08:36.925 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:36.925 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:36.925 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:36.925 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.925 #2 INITED exec/s: 0 rss: 65Mb 00:08:36.925 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:36.925 This may also happen if the target rejected all inputs we tried so far 00:08:36.925 [2024-11-28 16:32:34.431839] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:37.477 NEW_FUNC[1/668]: 0x4521a8 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:37.477 NEW_FUNC[2/668]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:37.477 #18 NEW cov: 11081 ft: 11036 corp: 2/7b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:08:37.477 #19 NEW cov: 11100 ft: 14902 corp: 3/13b lim: 6 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 ChangeBit- 00:08:37.735 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:37.735 #24 NEW cov: 11118 ft: 16528 corp: 4/19b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 5 EraseBytes-InsertByte-ShuffleBytes-CrossOver-CopyPart- 00:08:37.993 #27 NEW cov: 11118 ft: 17253 corp: 5/25b lim: 6 exec/s: 27 rss: 74Mb L: 6/6 MS: 3 CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:37.993 #28 NEW cov: 11118 ft: 17419 corp: 6/31b lim: 6 exec/s: 28 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:38.252 #29 NEW cov: 11118 ft: 17557 corp: 7/37b lim: 6 exec/s: 29 rss: 75Mb L: 6/6 MS: 1 CrossOver- 00:08:38.510 #30 NEW cov: 11118 ft: 17617 corp: 8/43b lim: 6 exec/s: 30 rss: 75Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:38.769 NEW_FUNC[1/1]: 0x13405b8 in nvmf_prop_get_asq /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1389 00:08:38.769 #36 NEW cov: 11140 ft: 18008 corp: 9/49b lim: 6 exec/s: 36 rss: 75Mb L: 6/6 MS: 1 ChangeBit- 00:08:38.769 #37 NEW cov: 11147 ft: 18387 corp: 10/55b lim: 6 exec/s: 37 rss: 75Mb L: 6/6 MS: 1 CopyPart- 00:08:39.028 #38 NEW cov: 11147 ft: 18452 corp: 11/61b lim: 6 exec/s: 19 rss: 75Mb L: 6/6 MS: 1 CrossOver- 00:08:39.028 #38 DONE cov: 11147 ft: 18452 corp: 11/61b lim: 6 exec/s: 
19 rss: 75Mb 00:08:39.028 Done 38 runs in 2 second(s) 00:08:39.028 [2024-11-28 16:32:36.532785] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:39.287 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:39.287 16:32:36 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:39.287 [2024-11-28 16:32:36.815455] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:39.287 [2024-11-28 16:32:36.815537] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3766058 ] 00:08:39.287 [2024-11-28 16:32:36.885240] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.287 [2024-11-28 16:32:36.923536] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.547 INFO: Running with entropic power schedule (0xFF, 100). 00:08:39.547 INFO: Seed: 4219277120 00:08:39.547 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:39.547 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:39.547 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:39.547 INFO: A corpus is not provided, starting from an empty corpus 00:08:39.547 #2 INITED exec/s: 0 rss: 65Mb 00:08:39.547 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:39.547 This may also happen if the target rejected all inputs we tried so far 00:08:39.547 [2024-11-28 16:32:37.157244] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:39.805 [2024-11-28 16:32:37.237492] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:39.805 [2024-11-28 16:32:37.237519] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:39.805 [2024-11-28 16:32:37.237539] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.064 NEW_FUNC[1/667]: 0x452748 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:40.064 NEW_FUNC[2/667]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:40.064 #12 NEW cov: 11038 ft: 11041 corp: 2/5b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 5 CrossOver-ChangeByte-ShuffleBytes-InsertByte-CMP- DE: "\020\000"- 00:08:40.064 [2024-11-28 16:32:37.707925] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.064 [2024-11-28 16:32:37.707958] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.064 [2024-11-28 16:32:37.707977] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.323 NEW_FUNC[1/3]: 0x15a3848 in vfio_user_map_cmd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:1687 00:08:40.323 NEW_FUNC[2/3]: 0x15a3ab8 in nvme_map_cmd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:975 00:08:40.323 #13 NEW cov: 11096 ft: 13702 corp: 3/9b lim: 4 exec/s: 0 rss: 73Mb L: 4/4 MS: 1 CopyPart- 00:08:40.323 [2024-11-28 16:32:37.919787] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.323 [2024-11-28 16:32:37.919811] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.323 [2024-11-28 16:32:37.919829] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.582 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:40.582 #19 NEW cov: 11113 ft: 15568 corp: 4/13b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 
MS: 1 PersAutoDict- DE: "\020\000"- 00:08:40.582 [2024-11-28 16:32:38.127197] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.582 [2024-11-28 16:32:38.127220] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.582 [2024-11-28 16:32:38.127258] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.840 #25 NEW cov: 11113 ft: 16303 corp: 5/17b lim: 4 exec/s: 25 rss: 74Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:40.840 [2024-11-28 16:32:38.320780] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:40.840 [2024-11-28 16:32:38.320802] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:40.841 [2024-11-28 16:32:38.320819] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:40.841 #26 NEW cov: 11113 ft: 16856 corp: 6/21b lim: 4 exec/s: 26 rss: 74Mb L: 4/4 MS: 1 PersAutoDict- DE: "\020\000"- 00:08:41.099 [2024-11-28 16:32:38.509923] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:41.099 [2024-11-28 16:32:38.509946] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:41.099 [2024-11-28 16:32:38.509964] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.099 #27 NEW cov: 11113 ft: 17266 corp: 7/25b lim: 4 exec/s: 27 rss: 74Mb L: 4/4 MS: 1 ChangeByte- 00:08:41.099 [2024-11-28 16:32:38.703631] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:41.099 [2024-11-28 16:32:38.703655] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:41.100 [2024-11-28 16:32:38.703672] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.358 #33 NEW cov: 11113 ft: 17613 corp: 8/29b lim: 4 exec/s: 33 rss: 75Mb L: 4/4 MS: 1 ChangeBit- 00:08:41.358 [2024-11-28 16:32:38.898133] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:41.358 [2024-11-28 16:32:38.898156] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:41.358 [2024-11-28 16:32:38.898173] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.616 #34 NEW cov: 11120 ft: 17922 corp: 9/33b lim: 4 exec/s: 34 rss: 75Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:41.616 [2024-11-28 16:32:39.087426] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:41.616 [2024-11-28 16:32:39.087449] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:41.616 [2024-11-28 16:32:39.087467] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:41.616 #35 NEW cov: 11120 ft: 18047 corp: 10/37b lim: 4 exec/s: 17 rss: 75Mb L: 4/4 MS: 1 ChangeByte- 00:08:41.616 #35 DONE cov: 11120 ft: 18047 corp: 10/37b lim: 4 exec/s: 17 rss: 75Mb 00:08:41.616 ###### Recommended dictionary. ###### 00:08:41.616 "\020\000" # Uses: 3 00:08:41.616 ###### End of recommended dictionary. 
###### 00:08:41.616 Done 35 runs in 2 second(s) 00:08:41.616 [2024-11-28 16:32:39.219792] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:41.875 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:41.875 16:32:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:41.875 [2024-11-28 16:32:39.501518] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 
00:08:41.875 [2024-11-28 16:32:39.501590] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3766589 ] 00:08:42.134 [2024-11-28 16:32:39.571554] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.134 [2024-11-28 16:32:39.609872] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.392 INFO: Running with entropic power schedule (0xFF, 100). 00:08:42.392 INFO: Seed: 2606310753 00:08:42.392 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:42.392 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:42.392 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:42.392 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.392 #2 INITED exec/s: 0 rss: 65Mb 00:08:42.392 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:42.392 This may also happen if the target rejected all inputs we tried so far 00:08:42.392 [2024-11-28 16:32:39.839133] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:08:42.392 [2024-11-28 16:32:39.894639] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:42.392 [2024-11-28 16:32:39.894673] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:42.957 NEW_FUNC[1/670]: 0x453138 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:08:42.957 NEW_FUNC[2/670]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:42.957 #8 NEW cov: 11068 ft: 10968 corp: 2/9b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:42.957 [2024-11-28 16:32:40.380766] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:42.957 [2024-11-28 16:32:40.380818] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:42.957 #14 NEW cov: 11088 ft: 13494 corp: 3/17b lim: 8 exec/s: 0 rss: 73Mb L: 8/8 MS: 1 CopyPart- 00:08:42.957 [2024-11-28 16:32:40.556964] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.214 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:43.214 #19 NEW cov: 11106 ft: 15133 corp: 4/25b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 5 EraseBytes-InsertByte-InsertByte-ChangeBinInt-InsertByte- 00:08:43.214 [2024-11-28 16:32:40.745078] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.214 #22 NEW cov: 11106 ft: 16043 corp: 5/33b lim: 8 exec/s: 22 rss: 75Mb L: 8/8 MS: 3 ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:43.472 [2024-11-28 16:32:40.935829] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.472 #33 NEW cov: 11106 ft: 16728 corp: 6/41b lim: 8 exec/s: 33 rss: 75Mb L: 8/8 MS: 1 CopyPart- 00:08:43.472 [2024-11-28 16:32:41.112753] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.730 #34 NEW cov: 11106 ft: 16844 corp: 7/49b lim: 8 exec/s: 34 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes- 
00:08:43.730 [2024-11-28 16:32:41.286327] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:43.730 [2024-11-28 16:32:41.286358] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:43.987 #35 NEW cov: 11106 ft: 17126 corp: 8/57b lim: 8 exec/s: 35 rss: 75Mb L: 8/8 MS: 1 CopyPart- 00:08:43.988 [2024-11-28 16:32:41.461848] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:43.988 #46 NEW cov: 11106 ft: 17205 corp: 9/65b lim: 8 exec/s: 46 rss: 75Mb L: 8/8 MS: 1 ChangeBit- 00:08:44.245 [2024-11-28 16:32:41.640393] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:44.245 [2024-11-28 16:32:41.640424] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:44.245 #47 NEW cov: 11113 ft: 17275 corp: 10/73b lim: 8 exec/s: 47 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:44.245 [2024-11-28 16:32:41.818937] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: no payload for cmd5 00:08:44.245 [2024-11-28 16:32:41.818966] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:08:44.503 #48 NEW cov: 11113 ft: 17298 corp: 11/81b lim: 8 exec/s: 24 rss: 75Mb L: 8/8 MS: 1 CopyPart- 00:08:44.503 #48 DONE cov: 11113 ft: 17298 corp: 11/81b lim: 8 exec/s: 24 rss: 75Mb 00:08:44.503 Done 48 runs in 2 second(s) 00:08:44.503 [2024-11-28 16:32:41.948789] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:44.761 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:44.761 16:32:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:44.761 [2024-11-28 16:32:42.235669] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:44.761 [2024-11-28 16:32:42.235740] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3767130 ] 00:08:44.761 [2024-11-28 16:32:42.305163] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.761 [2024-11-28 16:32:42.343405] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.019 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.020 INFO: Seed: 1052345718 00:08:45.020 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:45.020 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:45.020 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:45.020 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.020 #2 INITED exec/s: 0 rss: 65Mb 00:08:45.020 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:45.020 This may also happen if the target rejected all inputs we tried so far 00:08:45.020 [2024-11-28 16:32:42.583304] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:08:45.536 NEW_FUNC[1/669]: 0x453828 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:08:45.536 NEW_FUNC[2/669]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:45.536 #26 NEW cov: 11070 ft: 10954 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 CopyPart-InsertRepeatedBytes-ChangeByte-InsertRepeatedBytes- 00:08:45.794 #32 NEW cov: 11087 ft: 14198 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 CrossOver- 00:08:45.794 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:45.794 #33 NEW cov: 11104 ft: 16052 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:46.051 #34 NEW cov: 11104 ft: 16542 corp: 5/129b lim: 32 exec/s: 34 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:46.309 #45 NEW cov: 11104 ft: 16798 corp: 6/161b lim: 32 exec/s: 45 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:08:46.567 #46 NEW cov: 11104 ft: 17109 corp: 7/193b lim: 32 exec/s: 46 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:08:46.567 #47 NEW cov: 11104 ft: 17158 corp: 8/225b lim: 32 exec/s: 47 rss: 75Mb L: 32/32 MS: 1 ChangeByte- 00:08:46.825 #48 NEW cov: 11111 ft: 17401 corp: 9/257b lim: 32 exec/s: 48 rss: 75Mb L: 32/32 MS: 1 ChangeBit- 00:08:47.083 #53 NEW cov: 11111 ft: 17560 corp: 10/289b lim: 32 exec/s: 26 rss: 75Mb L: 32/32 MS: 5 EraseBytes-InsertRepeatedBytes-InsertRepeatedBytes-ChangeBit-CopyPart- 00:08:47.083 #53 DONE cov: 11111 ft: 17560 corp: 10/289b lim: 32 exec/s: 26 rss: 75Mb 00:08:47.083 Done 53 runs in 2 second(s) 00:08:47.083 [2024-11-28 16:32:44.609812] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local 
LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:47.342 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:47.342 16:32:44 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:08:47.342 [2024-11-28 16:32:44.896852] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:47.342 [2024-11-28 16:32:44.896928] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3767452 ] 00:08:47.342 [2024-11-28 16:32:44.966498] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.600 [2024-11-28 16:32:45.005725] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.600 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.600 INFO: Seed: 3712330767 00:08:47.600 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:47.600 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:47.600 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:47.600 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.600 #2 INITED exec/s: 0 rss: 65Mb 00:08:47.600 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:47.600 This may also happen if the target rejected all inputs we tried so far 00:08:47.858 [2024-11-28 16:32:45.252739] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:08:48.116 NEW_FUNC[1/668]: 0x4540a8 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:08:48.116 NEW_FUNC[2/668]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:48.116 #101 NEW cov: 11069 ft: 10931 corp: 2/33b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 4 InsertRepeatedBytes-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:48.373 NEW_FUNC[1/1]: 0x20da398 in spdk_u32log2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/util/math.c:20 00:08:48.373 #102 NEW cov: 11085 ft: 14385 corp: 3/65b lim: 32 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 ChangeBinInt- 00:08:48.631 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:48.631 #108 NEW cov: 11102 ft: 15434 corp: 4/97b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:48.631 #114 NEW cov: 11102 ft: 15753 corp: 5/129b lim: 32 exec/s: 114 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:48.889 #115 NEW cov: 11102 ft: 16113 corp: 6/161b lim: 32 exec/s: 115 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:08:49.147 #116 NEW cov: 11102 ft: 16516 corp: 7/193b lim: 32 exec/s: 116 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:49.147 #117 NEW cov: 11102 ft: 16817 corp: 8/225b lim: 32 exec/s: 117 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:08:49.405 #118 NEW cov: 11102 ft: 16892 corp: 9/257b lim: 32 exec/s: 118 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:49.663 #119 NEW cov: 11109 ft: 16910 corp: 10/289b lim: 32 exec/s: 119 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:08:49.663 #125 NEW cov: 11109 ft: 17276 corp: 11/321b lim: 32 exec/s: 62 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:08:49.663 #125 DONE cov: 11109 ft: 17276 corp: 11/321b lim: 32 exec/s: 62 rss: 74Mb 00:08:49.663 Done 125 runs in 2 second(s) 00:08:49.663 [2024-11-28 16:32:47.295799] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- 
vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:49.921 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:49.921 16:32:47 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:50.180 [2024-11-28 16:32:47.575847] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:50.180 [2024-11-28 16:32:47.575915] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3767951 ] 00:08:50.180 [2024-11-28 16:32:47.646234] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.180 [2024-11-28 16:32:47.684553] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.437 INFO: Running with entropic power schedule (0xFF, 100). 00:08:50.437 INFO: Seed: 2100368739 00:08:50.437 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:50.437 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:50.438 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:50.438 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.438 #2 INITED exec/s: 0 rss: 65Mb 00:08:50.438 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:50.438 This may also happen if the target rejected all inputs we tried so far 00:08:50.438 [2024-11-28 16:32:47.931162] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:08:50.438 [2024-11-28 16:32:47.977718] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.438 [2024-11-28 16:32:47.977756] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.954 NEW_FUNC[1/670]: 0x454aa8 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:08:50.954 NEW_FUNC[2/670]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:50.954 #110 NEW cov: 11084 ft: 11020 corp: 2/14b lim: 13 exec/s: 0 rss: 72Mb L: 13/13 MS: 3 CopyPart-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:50.954 [2024-11-28 16:32:48.460685] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:50.954 [2024-11-28 16:32:48.460727] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:50.954 #111 NEW cov: 11098 ft: 14739 corp: 3/27b lim: 13 exec/s: 0 rss: 73Mb L: 13/13 MS: 1 ChangeByte- 00:08:51.211 [2024-11-28 16:32:48.654709] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.211 [2024-11-28 16:32:48.654739] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.211 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:51.211 #117 NEW cov: 11115 ft: 16397 corp: 4/40b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 1 ChangeBinInt- 00:08:51.211 [2024-11-28 16:32:48.843759] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.211 [2024-11-28 16:32:48.843791] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.470 #118 NEW cov: 11115 ft: 17257 corp: 5/53b lim: 13 exec/s: 118 rss: 74Mb L: 13/13 MS: 1 ChangeBinInt- 00:08:51.470 [2024-11-28 16:32:49.024697] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.470 [2024-11-28 16:32:49.024728] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.729 #119 NEW cov: 11115 ft: 17642 corp: 6/66b lim: 13 exec/s: 119 rss: 75Mb L: 13/13 MS: 1 CrossOver- 00:08:51.729 [2024-11-28 16:32:49.202965] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.729 [2024-11-28 16:32:49.202996] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.729 #120 NEW cov: 11115 ft: 18092 corp: 7/79b lim: 13 exec/s: 120 rss: 75Mb L: 13/13 MS: 1 ChangeBit- 00:08:51.987 [2024-11-28 16:32:49.380892] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.987 [2024-11-28 16:32:49.380922] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:51.987 #121 NEW cov: 11115 ft: 18202 corp: 8/92b lim: 13 exec/s: 121 rss: 75Mb L: 13/13 MS: 1 CopyPart- 00:08:51.987 [2024-11-28 16:32:49.559819] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:51.987 [2024-11-28 16:32:49.559849] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.245 #122 NEW 
cov: 11115 ft: 18276 corp: 9/105b lim: 13 exec/s: 122 rss: 75Mb L: 13/13 MS: 1 ChangeBit- 00:08:52.245 [2024-11-28 16:32:49.744109] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.245 [2024-11-28 16:32:49.744141] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.245 #123 NEW cov: 11122 ft: 18296 corp: 10/118b lim: 13 exec/s: 123 rss: 75Mb L: 13/13 MS: 1 CopyPart- 00:08:52.503 [2024-11-28 16:32:49.921884] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:52.503 [2024-11-28 16:32:49.921915] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:52.503 #124 NEW cov: 11122 ft: 18350 corp: 11/131b lim: 13 exec/s: 62 rss: 75Mb L: 13/13 MS: 1 ChangeBit- 00:08:52.503 #124 DONE cov: 11122 ft: 18350 corp: 11/131b lim: 13 exec/s: 62 rss: 75Mb 00:08:52.503 Done 124 runs in 2 second(s) 00:08:52.503 [2024-11-28 16:32:50.047805] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:08:52.762 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:52.762 16:32:50 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:08:52.762 [2024-11-28 16:32:50.334577] Starting SPDK v24.09.1-pre git sha1 b18e1bd62 / DPDK 22.11.4 initialization... 00:08:52.762 [2024-11-28 16:32:50.334672] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3768492 ] 00:08:52.762 [2024-11-28 16:32:50.408406] app.c: 917:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.021 [2024-11-28 16:32:50.447899] reactor.c: 990:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.021 INFO: Running with entropic power schedule (0xFF, 100). 00:08:53.021 INFO: Seed: 566412921 00:08:53.021 INFO: Loaded 1 modules (381459 inline 8-bit counters): 381459 [0x29f7c8c, 0x2a54e9f), 00:08:53.021 INFO: Loaded 1 PC tables (381459 PCs): 381459 [0x2a54ea0,0x3026fd0), 00:08:53.021 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:08:53.021 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.021 #2 INITED exec/s: 0 rss: 65Mb 00:08:53.021 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:53.021 This may also happen if the target rejected all inputs we tried so far 00:08:53.279 [2024-11-28 16:32:50.688556] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:08:53.279 [2024-11-28 16:32:50.733657] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.279 [2024-11-28 16:32:50.733689] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.538 NEW_FUNC[1/668]: 0x455798 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:53.538 NEW_FUNC[2/668]: 0x457cb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:53.538 #22 NEW cov: 11058 ft: 10915 corp: 2/10b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 5 InsertRepeatedBytes-InsertByte-CopyPart-ChangeBit-CrossOver- 00:08:53.796 [2024-11-28 16:32:51.224219] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.796 [2024-11-28 16:32:51.224260] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:53.796 NEW_FUNC[1/1]: 0x1895848 in nvme_pcie_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_pcie_common.c:29 00:08:53.796 #23 NEW cov: 11081 ft: 14337 corp: 3/19b lim: 9 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CrossOver- 00:08:53.796 [2024-11-28 16:32:51.425691] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:53.796 [2024-11-28 16:32:51.425723] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.054 NEW_FUNC[1/1]: 0x1be2b88 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:656 00:08:54.054 #24 NEW cov: 11098 ft: 15087 corp: 4/28b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 ChangeByte- 00:08:54.054 [2024-11-28 16:32:51.617567] vfio_user.c:3106:vfio_user_log: 
*ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.054 [2024-11-28 16:32:51.617600] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.312 #25 NEW cov: 11098 ft: 16043 corp: 5/37b lim: 9 exec/s: 25 rss: 74Mb L: 9/9 MS: 1 CopyPart- 00:08:54.312 [2024-11-28 16:32:51.802558] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.312 [2024-11-28 16:32:51.802589] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.312 #26 NEW cov: 11098 ft: 16533 corp: 6/46b lim: 9 exec/s: 26 rss: 74Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:54.571 [2024-11-28 16:32:51.986850] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.571 [2024-11-28 16:32:51.986881] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.571 #27 NEW cov: 11098 ft: 17088 corp: 7/55b lim: 9 exec/s: 27 rss: 75Mb L: 9/9 MS: 1 ChangeBit- 00:08:54.571 [2024-11-28 16:32:52.173497] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.571 [2024-11-28 16:32:52.173528] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:54.828 #33 NEW cov: 11098 ft: 17125 corp: 8/64b lim: 9 exec/s: 33 rss: 75Mb L: 9/9 MS: 1 CMP- DE: "\005\"\353\353UE\223\000"- 00:08:54.828 [2024-11-28 16:32:52.363695] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:54.828 [2024-11-28 16:32:52.363726] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.086 #34 NEW cov: 11105 ft: 17287 corp: 9/73b lim: 9 exec/s: 34 rss: 75Mb L: 9/9 MS: 1 CrossOver- 00:08:55.086 [2024-11-28 16:32:52.557079] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:55.086 [2024-11-28 16:32:52.557109] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:55.086 #35 NEW cov: 11105 ft: 17497 corp: 10/82b lim: 9 exec/s: 17 rss: 75Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:55.086 #35 DONE cov: 11105 ft: 17497 corp: 10/82b lim: 9 exec/s: 17 rss: 75Mb 00:08:55.086 ###### Recommended dictionary. ###### 00:08:55.086 "\005\"\353\353UE\223\000" # Uses: 0 00:08:55.086 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:55.086 ###### End of recommended dictionary. 
###### 00:08:55.086 Done 35 runs in 2 second(s) 00:08:55.086 [2024-11-28 16:32:52.691794] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:08:55.344 16:32:52 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:08:55.344 16:32:52 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:55.344 16:32:52 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.344 16:32:52 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:08:55.344 00:08:55.344 real 0m19.377s 00:08:55.344 user 0m27.217s 00:08:55.344 sys 0m1.883s 00:08:55.344 16:32:52 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:55.344 16:32:52 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:55.344 ************************************ 00:08:55.344 END TEST vfio_llvm_fuzz 00:08:55.344 ************************************ 00:08:55.344 00:08:55.344 real 1m23.514s 00:08:55.344 user 2m6.882s 00:08:55.344 sys 0m10.283s 00:08:55.344 16:32:52 llvm_fuzz -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:55.344 16:32:52 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:55.344 ************************************ 00:08:55.344 END TEST llvm_fuzz 00:08:55.344 ************************************ 00:08:55.602 16:32:53 -- spdk/autotest.sh@374 -- # [[ '' -eq 1 ]] 00:08:55.602 16:32:53 -- spdk/autotest.sh@381 -- # trap - SIGINT SIGTERM EXIT 00:08:55.602 16:32:53 -- spdk/autotest.sh@383 -- # timing_enter post_cleanup 00:08:55.602 16:32:53 -- common/autotest_common.sh@724 -- # xtrace_disable 00:08:55.602 16:32:53 -- common/autotest_common.sh@10 -- # set +x 00:08:55.602 16:32:53 -- spdk/autotest.sh@384 -- # autotest_cleanup 00:08:55.602 16:32:53 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:08:55.602 16:32:53 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:08:55.602 16:32:53 -- common/autotest_common.sh@10 -- # set +x 00:09:02.163 INFO: APP EXITING 00:09:02.163 INFO: killing all VMs 00:09:02.163 INFO: killing vhost app 00:09:02.163 INFO: EXIT DONE 00:09:04.065 Waiting for block devices as requested 00:09:04.065 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:04.065 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:04.323 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:04.323 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:04.323 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:04.323 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:04.582 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:04.582 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:04.582 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:04.841 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:04.841 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:04.841 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:05.099 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:05.099 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:05.099 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:05.357 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:05.357 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:09:08.780 Cleaning 00:09:08.780 Removing: /dev/shm/spdk_tgt_trace.pid3741048 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3738577 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3739787 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3741048 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3741513 00:09:08.780 Removing: 
/var/run/dpdk/spdk_pid3742594 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3742615 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3743730 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3743735 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3744169 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3744497 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3744820 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3745156 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3745295 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3745528 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3745808 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3746127 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3746984 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3750156 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3750347 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3750512 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3750673 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3751068 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3751185 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3751644 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3751799 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3752201 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3752208 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3752498 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3752503 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3753001 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3753176 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3753461 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3753785 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3754290 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3754830 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3755166 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3755650 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3756179 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3756513 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3757005 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3757540 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3757874 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3758369 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3758899 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3759224 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3759723 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3760261 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3760637 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3761082 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3761584 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3761899 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3762432 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3762791 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3763252 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3763782 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3764071 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3764610 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3764994 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3765575 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3766058 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3766589 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3767130 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3767452 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3767951 00:09:08.780 Removing: /var/run/dpdk/spdk_pid3768492 00:09:08.780 Clean 00:09:09.054 16:33:06 -- common/autotest_common.sh@1451 -- # return 0 00:09:09.054 16:33:06 -- spdk/autotest.sh@385 -- # timing_exit post_cleanup 00:09:09.054 16:33:06 -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:09.054 16:33:06 -- common/autotest_common.sh@10 -- # set +x 00:09:09.054 16:33:06 -- 
spdk/autotest.sh@387 -- # timing_exit autotest 00:09:09.054 16:33:06 -- common/autotest_common.sh@730 -- # xtrace_disable 00:09:09.054 16:33:06 -- common/autotest_common.sh@10 -- # set +x 00:09:09.054 16:33:06 -- spdk/autotest.sh@388 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:09.054 16:33:06 -- spdk/autotest.sh@390 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:09.054 16:33:06 -- spdk/autotest.sh@390 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:09.054 16:33:06 -- spdk/autotest.sh@392 -- # [[ y == y ]] 00:09:09.054 16:33:06 -- spdk/autotest.sh@394 -- # hostname 00:09:09.054 16:33:06 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:09.314 geninfo: WARNING: invalid characters removed from testname! 00:09:11.852 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda 00:09:15.146 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda 00:09:17.678 16:33:14 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:25.800 16:33:22 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:29.993 16:33:27 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:35.268 16:33:32 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:40.544 16:33:37 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:45.819 16:33:43 -- spdk/autotest.sh@403 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:51.094 16:33:48 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:51.094 16:33:48 -- common/autotest_common.sh@1680 -- $ [[ y == y ]] 00:09:51.094 16:33:48 -- common/autotest_common.sh@1681 -- $ lcov --version 00:09:51.094 16:33:48 -- common/autotest_common.sh@1681 -- $ awk '{print $NF}' 00:09:51.094 16:33:48 -- common/autotest_common.sh@1681 -- $ lt 1.15 2 00:09:51.094 16:33:48 -- scripts/common.sh@373 -- $ cmp_versions 1.15 '<' 2 00:09:51.094 16:33:48 -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:09:51.094 16:33:48 -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:09:51.094 16:33:48 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:51.094 16:33:48 -- scripts/common.sh@336 -- $ read -ra ver1 00:09:51.094 16:33:48 -- scripts/common.sh@337 -- $ IFS=.-: 00:09:51.094 16:33:48 -- scripts/common.sh@337 -- $ read -ra ver2 00:09:51.094 16:33:48 -- scripts/common.sh@338 -- $ local 'op=<' 00:09:51.094 16:33:48 -- scripts/common.sh@340 -- $ ver1_l=2 00:09:51.094 16:33:48 -- scripts/common.sh@341 -- $ ver2_l=1 00:09:51.094 16:33:48 -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:09:51.094 16:33:48 -- scripts/common.sh@344 -- $ case "$op" in 00:09:51.094 16:33:48 -- scripts/common.sh@345 -- $ : 1 00:09:51.094 16:33:48 -- scripts/common.sh@364 -- $ (( v = 0 )) 00:09:51.094 16:33:48 -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:51.094 16:33:48 -- scripts/common.sh@365 -- $ decimal 1 00:09:51.094 16:33:48 -- scripts/common.sh@353 -- $ local d=1 00:09:51.094 16:33:48 -- scripts/common.sh@354 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:51.094 16:33:48 -- scripts/common.sh@355 -- $ echo 1 00:09:51.094 16:33:48 -- scripts/common.sh@365 -- $ ver1[v]=1 00:09:51.094 16:33:48 -- scripts/common.sh@366 -- $ decimal 2 00:09:51.094 16:33:48 -- scripts/common.sh@353 -- $ local d=2 00:09:51.094 16:33:48 -- scripts/common.sh@354 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:51.094 16:33:48 -- scripts/common.sh@355 -- $ echo 2 00:09:51.094 16:33:48 -- scripts/common.sh@366 -- $ ver2[v]=2 00:09:51.094 16:33:48 -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:09:51.094 16:33:48 -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:09:51.094 16:33:48 -- scripts/common.sh@368 -- $ return 0 00:09:51.094 16:33:48 -- common/autotest_common.sh@1682 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:51.094 16:33:48 -- common/autotest_common.sh@1694 -- $ export 'LCOV_OPTS= 00:09:51.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:51.094 --rc genhtml_branch_coverage=1 00:09:51.094 --rc genhtml_function_coverage=1 00:09:51.094 --rc genhtml_legend=1 00:09:51.094 --rc geninfo_all_blocks=1 00:09:51.094 --rc geninfo_unexecuted_blocks=1 00:09:51.094 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:51.094 ' 00:09:51.094 16:33:48 -- common/autotest_common.sh@1694 -- $ LCOV_OPTS=' 00:09:51.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:51.094 --rc genhtml_branch_coverage=1 00:09:51.094 --rc genhtml_function_coverage=1 00:09:51.094 --rc genhtml_legend=1 00:09:51.094 --rc geninfo_all_blocks=1 00:09:51.094 --rc geninfo_unexecuted_blocks=1 00:09:51.094 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:51.094 ' 00:09:51.094 16:33:48 -- common/autotest_common.sh@1695 -- $ export 'LCOV=lcov 00:09:51.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:51.094 --rc genhtml_branch_coverage=1 00:09:51.094 --rc genhtml_function_coverage=1 00:09:51.094 --rc genhtml_legend=1 00:09:51.094 --rc geninfo_all_blocks=1 00:09:51.094 --rc geninfo_unexecuted_blocks=1 00:09:51.094 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:51.094 ' 00:09:51.094 16:33:48 -- common/autotest_common.sh@1695 -- $ LCOV='lcov 00:09:51.094 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:51.094 --rc genhtml_branch_coverage=1 00:09:51.094 --rc genhtml_function_coverage=1 00:09:51.094 --rc genhtml_legend=1 00:09:51.094 --rc geninfo_all_blocks=1 00:09:51.094 --rc geninfo_unexecuted_blocks=1 00:09:51.094 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:51.094 ' 00:09:51.094 16:33:48 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:51.094 16:33:48 -- scripts/common.sh@15 -- $ shopt -s extglob 00:09:51.094 16:33:48 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:51.094 16:33:48 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:51.094 16:33:48 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:51.094 16:33:48 -- paths/export.sh@2 -- $ 
00:09:51.094 16:33:48 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:51.094 16:33:48 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:51.094 16:33:48 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:51.094 16:33:48 -- paths/export.sh@5 -- $ export PATH
00:09:51.094 16:33:48 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:51.094 16:33:48 -- common/autobuild_common.sh@478 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:51.094 16:33:48 -- common/autobuild_common.sh@479 -- $ date +%s
00:09:51.094 16:33:48 -- common/autobuild_common.sh@479 -- $ mktemp -dt spdk_1732808028.XXXXXX
00:09:51.094 16:33:48 -- common/autobuild_common.sh@479 -- $ SPDK_WORKSPACE=/tmp/spdk_1732808028.LEkO4f
00:09:51.094 16:33:48 -- common/autobuild_common.sh@481 -- $ [[ -n '' ]]
00:09:51.094 16:33:48 -- common/autobuild_common.sh@485 -- $ '[' -n v22.11.4 ']'
00:09:51.094 16:33:48 -- common/autobuild_common.sh@486 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:09:51.094 16:33:48 -- common/autobuild_common.sh@486 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk'
00:09:51.094 16:33:48 -- common/autobuild_common.sh@492 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:51.094 16:33:48 -- common/autobuild_common.sh@494 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:51.094 16:33:48 -- common/autobuild_common.sh@495 -- $ get_config_params
00:09:51.094 16:33:48 -- common/autotest_common.sh@407 -- $ xtrace_disable
00:09:51.094 16:33:48 -- common/autotest_common.sh@10 -- $ set +x
00:09:51.094 16:33:48 -- common/autobuild_common.sh@495 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user'
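The paths/export.sh trace above prepends each tool directory unconditionally, so directories that were already on PATH (golangci, go, protoc) end up listed twice. A small sketch of a dedup-before-prepend variant; the pathmunge helper is hypothetical and not part of export.sh, which simply does PATH=dir:$PATH per tool:

  # Hypothetical idempotent prepend; avoids the duplicate entries visible above.
  pathmunge() {
      case ":$PATH:" in
          *":$1:"*) ;;             # already present: leave PATH unchanged
          *) PATH="$1:$PATH" ;;    # otherwise prepend
      esac
  }
  pathmunge /opt/golangci/1.54.2/bin
  pathmunge /opt/go/1.21.1/bin
  pathmunge /opt/protoc/21.7/bin
  export PATH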
00:09:51.094 16:33:48 -- common/autobuild_common.sh@497 -- $ start_monitor_resources
00:09:51.094 16:33:48 -- pm/common@17 -- $ local monitor
00:09:51.094 16:33:48 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:51.094 16:33:48 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:51.094 16:33:48 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:51.094 16:33:48 -- pm/common@21 -- $ date +%s
00:09:51.094 16:33:48 -- pm/common@21 -- $ date +%s
00:09:51.094 16:33:48 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:51.094 16:33:48 -- pm/common@25 -- $ sleep 1
00:09:51.094 16:33:48 -- pm/common@21 -- $ date +%s
00:09:51.094 16:33:48 -- pm/common@21 -- $ date +%s
00:09:51.094 16:33:48 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1732808028
00:09:51.094 16:33:48 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1732808028
00:09:51.094 16:33:48 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1732808028
00:09:51.094 16:33:48 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1732808028
00:09:51.094 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1732808028_collect-vmstat.pm.log
00:09:51.094 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1732808028_collect-cpu-load.pm.log
00:09:51.094 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1732808028_collect-cpu-temp.pm.log
00:09:51.095 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1732808028_collect-bmc-pm.bmc.pm.log
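The pm/common trace above backgrounds four collectors (vmstat, cpu-load, cpu-temp and, via sudo -E, bmc-pm), each redirecting to a timestamped .pm.log under output/power and leaving a .pid file that signal_monitor_resources later reads back to deliver TERM, as the records below show. A minimal sketch of that pid-file lifecycle, with invented helper names; the real collect-* scripts manage their own pid files and options:

  # Hypothetical pid-file handshake in the style of the pm/common trace.
  power_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power

  start_monitor() {   # start_monitor <collector-path>
      local name
      name=$(basename "$1")
      "$1" -d "$power_dir" -l -p "monitor.autopackage.sh.$(date +%s)" &
      echo $! > "$power_dir/$name.pid"    # remember the child for shutdown
  }

  stop_monitor() {    # stop_monitor <collector-name>
      local pidfile=$power_dir/$1.pid
      [[ -e $pidfile ]] && kill -TERM "$(cat "$pidfile")"
  }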
00:09:52.034 16:33:49 -- common/autobuild_common.sh@498 -- $ trap stop_monitor_resources EXIT
00:09:52.034 16:33:49 -- spdk/autopackage.sh@10 -- $ [[ 0 -eq 1 ]]
00:09:52.034 16:33:49 -- spdk/autopackage.sh@14 -- $ timing_finish
00:09:52.034 16:33:49 -- common/autotest_common.sh@736 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:52.034 16:33:49 -- common/autotest_common.sh@737 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]]
00:09:52.034 16:33:49 -- common/autotest_common.sh@740 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:52.034 16:33:49 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:09:52.034 16:33:49 -- pm/common@29 -- $ signal_monitor_resources TERM
00:09:52.034 16:33:49 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:09:52.034 16:33:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:52.034 16:33:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:09:52.034 16:33:49 -- pm/common@44 -- $ pid=3777200
00:09:52.034 16:33:49 -- pm/common@50 -- $ kill -TERM 3777200
00:09:52.034 16:33:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:52.034 16:33:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:09:52.034 16:33:49 -- pm/common@44 -- $ pid=3777202
00:09:52.034 16:33:49 -- pm/common@50 -- $ kill -TERM 3777202
00:09:52.034 16:33:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:52.034 16:33:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:09:52.034 16:33:49 -- pm/common@44 -- $ pid=3777204
00:09:52.034 16:33:49 -- pm/common@50 -- $ kill -TERM 3777204
00:09:52.034 16:33:49 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:52.034 16:33:49 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:09:52.034 16:33:49 -- pm/common@44 -- $ pid=3777226
00:09:52.034 16:33:49 -- pm/common@50 -- $ sudo -E kill -TERM 3777226
00:09:52.034 + [[ -n 3613806 ]]
00:09:52.034 + sudo kill 3613806
00:09:52.304 [Pipeline] }
00:09:52.320 [Pipeline] // stage
00:09:52.325 [Pipeline] }
00:09:52.340 [Pipeline] // timeout
00:09:52.346 [Pipeline] }
00:09:52.360 [Pipeline] // catchError
00:09:52.365 [Pipeline] }
00:09:52.383 [Pipeline] // wrap
00:09:52.390 [Pipeline] }
00:09:52.405 [Pipeline] // catchError
00:09:52.415 [Pipeline] stage
00:09:52.418 [Pipeline] { (Epilogue)
00:09:52.435 [Pipeline] catchError
00:09:52.437 [Pipeline] {
00:09:52.450 [Pipeline] echo
00:09:52.452 Cleanup processes
00:09:52.458 [Pipeline] sh
00:09:52.742 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:52.742 3777343 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache
00:09:52.742 3777764 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:52.758 [Pipeline] sh
00:09:53.046 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:53.046 ++ grep -v 'sudo pgrep'
00:09:53.046 ++ awk '{print $1}'
00:09:53.046 + sudo kill -9 3777343
00:09:53.046 + true
00:09:53.057 [Pipeline] sh
00:09:53.341 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:53.341 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:53.341 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:54.721 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:04.715 [Pipeline] sh
00:10:04.999 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:04.999 Artifacts sizes are good
00:10:05.014 [Pipeline] archiveArtifacts
00:10:05.022 Archiving artifacts
00:10:05.185 [Pipeline] sh
00:10:05.532 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:05.546 [Pipeline] cleanWs
00:10:05.557 [WS-CLEANUP] Deleting project workspace...
00:10:05.557 [WS-CLEANUP] Deferred wipeout is used...
00:10:05.564 [WS-CLEANUP] done
00:10:05.565 [Pipeline] }
00:10:05.580 [Pipeline] // catchError
00:10:05.591 [Pipeline] sh
00:10:05.873 + logger -p user.info -t JENKINS-CI
00:10:05.882 [Pipeline] }
00:10:05.896 [Pipeline] // stage
00:10:05.902 [Pipeline] }
00:10:05.916 [Pipeline] // node
00:10:05.921 [Pipeline] End of Pipeline
00:10:05.964 Finished: SUCCESS